
AMD Radeon RX 9070 XT Benchmarked in 3D Mark Time Spy Extreme and Speed Way

Unbelievable.. :kookoo:

Are you trolling?

With an Nvidia 4090 you get 150% of the performance and far better software for Windows operating systems.
But you want to pay the same price as a slow Radeon 7900 XTX?

Please pick graphics cards with similar performance for price comparisons. I think an Nvidia 4070, 4070 Ti, or 4070 Ti Super, if it exists, is the proper counterpart. (I don't really care about the lower Nvidia parts.)

Also pick prices from a proper date. That Nvidia 4090 was around €1,300-1,500 for ages, including tax and shipping, in central Europe from what I remember. I think the 4090 is already end of life, which drives prices up.
 
Judging by your comments, you are the one trolling.

Check your facts and come back afterwards.
 
Also, FSR4 seems to be really good. Hardware Unboxed had a YT video of it, and it's a huge leap compared to FSR 3.1, like a night-and-day difference, and I mean that in a positive way.
 
I'm quite excited to try out FSR4 and DLSS4.

I really dislike both FSR3 and DLSS3, but if they have genuinely fixed the smearing, ghosting, and temporal blur, that might be enough to convert me. It's hard to judge how good or bad an upscaler is via captured, encoded, compressed YouTube footage, though, so the only verdict I'll trust is my own eyes. That said, I agree that the DF and HUB close-up footage of FSR4 looks very promising from here.

I'm going to grab the cheapest 16GB Nvidia and AMD cards (probably the 5070Ti and vanilla 9070) as soon as they're available, for work reasons anyway, so I'll try both and see which one I prefer.
 
You can see the difference here. Perfect? Probably not, but a massive improvement? Sure as can be.
 
Yeah, that's the HUB video I mentioned. The other one is DF:
 
you are the one trolling.
Well, the 4070 Ti Super is pretty accurate. Sure, it's slower in most rasterised scenarios by a noticeable yet modest margin (around 10 percent) and has far less VRAM, but it wins at RT, especially heavy RT, and has more features. It's also more energy efficient. That's why I deem it a fair GPU to call similar to the XTX.

The 4070 and 4070S surely are trolling; those are meant to compete with the 7800 XT and 7900 GRE respectively.
I'm quite excited to try out FSR4 and DLSS4.
I highly doubt it's going to be good enough to convert you. DLAA on top of a 1080p display (or DSR / a 1440p display if you can afford an xx70 Ti+ class GPU) still strikes me as THE way to play vidya. Upscaling is a great way to enjoy 4K gaming, but GPUs strong enough for DLSS Quality / XeSS Ultra Quality / FSR Quality at 4K are unobtainium for most gamers. At Balanced and lower, it's usually less exciting than plain 1440p, and at 1440p and below, upscaling is just a last resort. Good to have as a fall-back option, but bad if you need it.
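For reference, here is a rough sketch of the internal render resolutions behind those mode names. The per-axis scale factors are the commonly cited ones for DLSS/FSR presets and can vary per title, so treat the numbers as approximations:

```python
# Approximate internal render resolutions for common upscaler presets.
# The per-axis scale factors below are the commonly cited DLSS/FSR values;
# individual games can deviate, so this is a ballpark sketch only.

MODES = {
    "Quality":     0.667,
    "Balanced":    0.580,
    "Performance": 0.500,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

for mode in MODES:
    w, h = internal_res(3840, 2160, mode)
    print(f"4K {mode}: ~{w}x{h} internal")
# 4K Quality  -> ~2561x1441 (roughly a native 1440p pixel count)
# 4K Balanced -> ~2227x1253 (already below native 1440p)
```

Which is the point above: 4K Balanced already renders fewer pixels than a plain 1440p display.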
 
I game at 4K120 in one room (DLSS FG) and 1440p240 in the other (currently FSR FG with as little upscaling as possible; it depends on whether the 7800XT has the grunt to run the game fast enough without FG assistance).

Since neither of my GPUs can do modern AAA games at high-refresh native res, I'm usually dropping to 1080p120 on the GeForce or 1440p120 (with strobing) on the Radeon.

I don't want to use DLSS3 or FSR3, but FG is the carrot that keeps me trying them occasionally, because FG is a great way to get high refresh rates, so long as the base framerate (half the FG'd framerate, which is what sets the latency) stays over about 80fps.
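That rule of thumb is easy to make explicit. A minimal sketch (the helper names are mine, nothing official), assuming 2x frame generation where latency tracks the rendered framerate:

```python
# Rule-of-thumb check for 2x frame generation: input latency is governed
# by the base (rendered) framerate, which is half the displayed framerate.

def base_fps(displayed_fps: float, fg_factor: int = 2) -> float:
    """Rendered framerate underlying an FG'd output framerate."""
    return displayed_fps / fg_factor

def fg_feels_ok(displayed_fps: float, min_base: float = 80.0) -> bool:
    """True if the underlying framerate clears the ~80 fps latency floor."""
    return base_fps(displayed_fps) >= min_base

print(base_fps(160))      # 80.0 -> right at the comfort floor
print(fg_feels_ok(200))   # True: 100 fps base
print(fg_feels_ok(120))   # False: only 60 fps base behind a 120 fps output
```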
 
I game at 4K120 in one room (DLSS FG) and 1440p240 in the other

Maybe you could sell one of the rooms and buy a 5090 so you don't need upscaling?

This one (strobing) is interesting. I have only seen it in YT videos, not in real life; my displays don't support the feature.
FG is a great way to get high refresh rates
I can clearly see that half the frames are fake, and it bugs me every time I enable it. It's much worse on AMD. I can live with upscaling artifacts, but not with this chicanery. No.

Still, I am interested to see if the 9070 XT makes any difference. It feels like it's about two years too late to the party.
 
Regarding upscaling, I don't know why FSR was ever compared to DLSS since its release; you can't compare software-based to hardware-based upscaling, so it was always an unfair comparison. This is AMD's first attempt at machine-learning-based upscaling.
 
I ran Speedway yesterday, and I scored 200 points less than the screenshot.

Bah. Now I need a new GPU :banghead:
 
Well, the 4070 Ti Super is pretty accurate. Sure, it's slower in most rasterised scenarios by a noticeable yet modest margin (around 10 percent) and has far less VRAM, but it wins at RT, especially heavy RT, and has more features. It's also more energy efficient. That's why I deem it a fair GPU to call similar to the XTX.
My opinion about RT was given on the first page of this thread. Feel free to go and have a look.
 
Okay, so I re-tested Speedway on my reference 7900XTX with stock clocks. This result is 6% (4 fps) higher than the old one, probably due to driver or app optimizations:

speedway.jpg

That would make the 9070XT possibly equal to the 7900XTX here, rather than faster (as was my previous approximation based on the old result). Accounting for the differences in alleged boost clocks and Ray Accelerator count, RDNA4 would show close to 25% improvement over the previous gen in this particular benchmark.

Naturally, these are still speculations on my part. Speedway is a hybrid RT implementation, and it's likely that in RT-heavy games RDNA4 could show a bigger improvement.
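To make the arithmetic behind that estimate explicit, here is a back-of-the-envelope sketch. The clock speeds and Ray Accelerator counts below are the alleged/leaked figures, not confirmed specs, and the equal-score assumption is the one from the post above:

```python
# Back-of-the-envelope per-unit RT uplift, assuming the 9070 XT roughly
# ties the 7900 XTX in Speedway. Clocks and Ray Accelerator counts are
# alleged/leaked figures, not confirmed specs.

xtx_ras, xtx_clock = 96, 2.5   # 7900 XTX: 96 RAs, ~2.5 GHz boost
xt_ras, xt_clock   = 64, 3.0   # 9070 XT: 64 RAs, ~3.0 GHz alleged boost

score_ratio = 1.0  # assume equal Speedway scores

# Work done per Ray Accelerator per clock, new gen relative to old:
per_unit_gain = score_ratio * (xtx_ras * xtx_clock) / (xt_ras * xt_clock)
print(f"~{(per_unit_gain - 1) * 100:.0f}% per-RA, per-clock uplift")  # ~25%
```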
 
Maybe you sell one room and buy a 5090 so you don't need upscaling..?
I can have any hardware I want. I don't pay for it and even if I did, money is not a problem.

I took the 3090 out and put a 7800XT in because the 3090 was too hot and noisy, and I found all of the RT titles at the time underwhelming anyway, because I hate temporal blur and most RT implementations add a lot of it.

The other PC is running a 16GB 4060Ti because it performs fine when I choke it down to 150W. I'm hoping there's something faster at 150W in this coming round of GPUs from AMD and Nvidia.
 
Speedway is a hybrid RT implementation, and it's likely that in RT-heavy games RDNA4 could show a bigger improvement.
I believe so too.

What was the GPU clock during your test?
From the graph in the screenshot, I would say around 2200-2300MHz?
And what was the TBP limit?

I'd place my bet on the 5070, no doubt, and the 9070 seems capable of that, too.
The 5070 non-Ti and 9070 non-XT at around 150W?
Highly unlikely.

Both of them will be in the 250~300W range.

Nvidia already shows the 5070 reference at 250W, and the 9070 can't be that far behind the 9070 XT; most likely the latter's reference card will start around 330W.

https://www.techpowerup.com/gpu-specs/geforce-rtx-5070.c4218

The 5060 non-Ti maybe will be around 150W.
 
+1 to this, the raw hardware specs of the card don't seem like it should be able to hit the performance levels AMD is claiming.
Where did AMD claim these levels? This is the problem: I specifically remember a publication saying that AMD themselves said the leaks are inaccurate. Nowhere did they say their numbers were higher or lower. People make false claims way too often.
 
The 5070 non-Ti and 9070 non-XT at around 150W?
Highly unlikely.

Both of them will be in the 250~300W range.
I know, but you can manually tune them to eat no more than 150W, and I'm sure as hell they'll both destroy the 4060 Ti at that power limit.
 
Sure, take (almost) any GPU and cut power by 40-45%, and you lose "only" 20~25% performance.

With an OC version of the same GPU, a 40% decrease in power could mean an even smaller performance drop, like 10-15%.
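Putting rough numbers on that trade-off (the percentages are the ballpark figures from this exchange, not measured data), the perf-per-watt win is easy to quantify:

```python
# Rough efficiency math for power-limiting, using the ballpark figures
# from this thread (not measured data).

def perf_per_watt_gain(power_cut: float, perf_loss: float) -> float:
    """Relative perf/W versus stock, given a power cut and the perf it costs."""
    return (1 - perf_loss) / (1 - power_cut)

print(f"{perf_per_watt_gain(0.40, 0.20):.2f}x perf/W")  # ~1.33x
print(f"{perf_per_watt_gain(0.45, 0.25):.2f}x perf/W")  # ~1.36x
print(f"{perf_per_watt_gain(0.40, 0.10):.2f}x perf/W")  # ~1.50x for the OC case
```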
 
Just posting a Speedway result for a 4070Ti for reference, in case inquiring minds want to know... it's also overclocked...

Screenshot 2025-01-10 090705.jpg

Pathetic :(
 
The 5070 non-Ti and 9070 non-XT at around 150W?
Highly unlikely.

Both of them will be in the 250~300W range.

Nvidia already shows the 5070 reference at 250W, and the 9070 can't be that far behind the 9070 XT; most likely the latter's reference card will start around 330W.

https://www.techpowerup.com/gpu-specs/geforce-rtx-5070.c4218

The 5060 non-Ti maybe will be around 150W.
50-series is on the same process node as 40-series, so I would expect most models to increase in TDP. My hope is that the 5060Ti isn't quite as hamstrung on memory bandwidth as the 4060Ti.

On the AMD side, there's a node shrink from 5+6nm to 4nm, so there's potential for some excellent undervolting there. The TDPs we're seeing at the moment seem to be from factory-OC models pushing >3GHz, which is always going to be so far beyond the efficiency sweet spot that it's ridiculous. We don't know the configs of the 9060-series yet, nor the vanilla 9070, but I'm hoping the 9070-series can be run at, say, ~2.4GHz at somewhere between 150 and 200W. I'm usually seeing 50% power draw for 80% clock speeds with RDNA2 and RDNA3.
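That 50%-power-at-80%-clocks observation lines up with the usual first-order model where dynamic power scales with frequency times voltage squared, and voltage rises roughly in proportion to frequency near the top of the V/F curve, so power goes roughly with the cube of clock speed. A sketch with that cube law as an explicit assumption, not a measured RDNA curve:

```python
# First-order model: dynamic power ~ f * V^2, and V rises roughly in
# proportion to f near the top of the V/F curve, so power ~ f^3.
# This is an assumption, not a measured RDNA power curve.

def relative_power(clock_fraction: float) -> float:
    return clock_fraction ** 3

print(f"{relative_power(0.80):.2f}")       # 0.51 -> ~50% power at 80% clocks
print(f"{relative_power(2.4 / 3.0):.2f}")  # ~0.51 -> 2.4GHz vs a 3GHz OC model
```

By that model, dropping a ~330W factory-OC card from ~3GHz to ~2.4GHz would land near 170W, which is roughly the 150-200W window hoped for above.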
 
5900X + 7900XTX (TBP 380W, VRAM 2600MHz)

View attachment 379432
OK, thanks a lot. I wanted to compare yours with the 9070 XT.
You really have a good card.
From AMD's statements: "All these performance leaks, well, it is accurate for the way the driver performs on the card right now. It is nowhere near where the card will actually perform once we release the full performance driver."
Journalist: "Did that also factor into your decision?"
Azor: "It's not a readiness issue."
McAfee: "We have in house the full performance driver. We intentionally chose to enable partners with a driver which exercises all of the, let's call it, thermal-mechanical aspects of the card, without really running that risk of leaking performance on critical aspects of the product. That's pretty standard practice."

It looks like it could perhaps match a 7900 XTX.
 