
AMD Radeon RX 9070 XT Benchmarked in 3D Mark Time Spy Extreme and Speed Way

Did you notice that the 3DMark test was done with a 285K?
If that makes any difference...
In Speed Way? No, it makes absolutely no difference.
 
Exactly. 3DMark is a mess for this stupid reason and that's why I asked, because honestly the numbers are totally meaningless without stock results to compare to.

Does any website test GPUs at stock settings with 3DMark and publish scores of GPUs that are representative of what people actually own?
On the 3DMark website, in the search section, you can enter the stock GPU and memory clock values of the 7900 XT or XTX and then compare results without overclocking.

5900X + 7900XTX (TBP 366+10%=402W, GPU clock 2620~2670MHz, VRAM 2600MHz)
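As a quick aside, the quoted board power is just the stock TBP plus the power-limit slider; the snippet below only redoes that arithmetic with the numbers from this post:

```python
# Stock TBP plus a +10% power-limit bump, as quoted above.
tbp_w = 366
power_limit = 0.10
print(f"{tbp_w * (1 + power_limit):.0f} W")   # ~403 W, i.e. roughly the 402 W figure above
```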

View attachment 379325
What brand is your GPU? Sapphire?
 
Something is very off about all of this if you think about it.

  • The die size of the 9070 XT is bigger than the 4080's, but it has 2k fewer shaders than the 7900 XT despite being on a smaller process node.
  • AMD has explicitly stated in public that the “performance figures” are all wrong.
  • Rumors that they have deliberately sent out gimped drivers to throw off leaks.
Something else is under the hood of the 9070 that AMD is not sharing with us, and it is taking up valuable real estate. Something that could be the fruit of their purchase of Xilinx, or some new type of programmable unit.

If the 9700pro was "Isildur" slicing the fingers off of Nvidia's hand, the 9070xt might be the return of the King.
 
Something is very off about all of this if you think about it.

  • The die size of the 9070 XT is bigger than the 4080's, but it has 2k fewer shaders than the 7900 XT despite being on a smaller process node.
  • AMD has explicitly stated in public that the “performance figures” are all wrong.
  • Rumors that they have deliberately sent out gimped drivers to throw off leaks.
Something else is under the hood of the 9070 that AMD is not sharing with us, and it is taking up valuable real estate. Something that could be the fruit of their purchase of Xilinx, or some new type of programmable unit.
Perhaps. I've wondered where the die real estate went to, myself.
I do hope AMD is allowing disappointing leaks out, and that the final product is much more impressive, but... even AMD themselves are positioning the RX 9070 (XT) as a replacement for the 7900 GRE - XT 'range'.
[attached slide: AMD's RX 9070 series positioning]
^That slide alone is what had me pull the trigger on a $900+ 24GB XTX right after CES. AMD has no replacement-tier card.

Funny enough, AMD seems to be ending the separation of RDNA 'Graphics' and CDNA 'Compute' the exact same way it started:
The RX 7900 XTX becomes the 'Radeon VII of Today'
and the
RX 9070 (XT) becomes the 'RX 5700 (XT) of Today'

If the 9700pro was "Isildur" slicing the fingers off of Nvidia's hand, the 9070xt might be the return of the King.
Extraordinarily wishful thinking.
The situation has changed considerably since then. Nvidia is a monster of a company today - resources and all.
 
Something is very off about all of this if you think about it.

  • The die size of the 9070 XT is bigger than the 4080's, but it has 2k fewer shaders than the 7900 XT despite being on a smaller process node.
Perhaps by "improved RT" they meant they're giving us more RT cores? Or maybe the AI that does FSR 4 is taking up space? It could explain why some models need so much power. Personally, as long as it's a fine card for a decent price, I don't care.

  • AMD has explicitly stated in public that the “performance figures” are all wrong.
Where?

  • Rumors that they have deliberately sent out gimped drivers to throw off leaks.
Why would they have done that?

Something else is under the hood of the 9070 that AMD is not sharing with us, and it is taking up valuable real estate. Something that could be the fruit of their purchase of Xilinx, or some new type of programmable unit.

If the 9700pro was "Isildur" slicing the fingers off of Nvidia's hand, the 9070xt might be the return of the King.
If the price is right, it very well may be.
 
Honestly, I couldn't give a rat's tit about RT performance, but if raster comes in around the same as the 7900 XT, it would be pretty decent. I'm more interested in figuring out what they've done die-wise, because it looks strangely similar to two 9060 XTs side by side. Are they up to some sort of modular arch or what? I'm not sure, but I want that die annotation.
 
Have you checked Cyberpunk, Indiana Jones or Alan Wake II with full RT? It's a huge hit to performance, yes, but it's beautiful.
I wouldn't expect the usual suspects here to admit that even if they had seen it, tbh. It'd be easy enough to showcase some gorgeous differences and cherry-pick those screenshots or video segments (like what was done to show how little difference it can make - which I don't deny, depending on the game or scene). But I certainly wouldn't expect to convince those so vocally against it anyway; they've already made up their minds and appear to enjoy patting themselves on the back for it.

-------------------------------------------------------------------------------------------------------------

Personally, this is shaping up to be ever more appetising to me as my upgrade path. If it really is:
  • A raster match~ish for a 7900XTX or 4080/S
  • RT performance that is generationally ahead of Ampere
  • FSR4 (or what they directly called a research project) is as good as what we saw in their booth for all or at least most games (not just fine-tuned for one title), and is easily adopted widely or able to be substituted in place of 3.1, as has been rumoured
  • Some AIB cards have 2x HDMI 2.1
  • And of course, priced to party...
Well then I'm going to have a hard time justifying to myself paying a bare minimum of $1519 AUD for a 5070Ti or better.

bring on the release and reviews.
 
I'm talking about Cyberpunk specifically. I've poured 600+ hours into that game -- the RT is one of the best implementations I've seen, and it still looks like crap (IMO).

View attachment 379240


It's grainier, blurrier:

View attachment 379241

Pick the RT shot -- it's the one on the left.
Maybe my eyes are bad, but it's hardly any different, IMO.
 
Maybe my eyes are bad, but it's hardly any different, IMO.
Right - it's almost the same, with the RT being slightly blurrier... for -60% FPS.

And it's not like I had the lowest RT setting turned on - this was a 4090 with everything cranked. Some scenes look cool, but then you turn off RT and realize they're just as cool with it off, but now you also get 150 FPS.
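For context, a quick back-of-the-envelope on what that trade-off looks like in frame-time terms (the 150 FPS and -60% figures are the ones quoted in this exchange; the rest is just arithmetic):

```python
# Rough frame-rate arithmetic for the numbers quoted above.
fps_rt_off = 150
rt_cost = 0.60                          # the "-60% FPS" hit
fps_rt_on = fps_rt_off * (1 - rt_cost)  # ~60 FPS
print(f"RT on: ~{fps_rt_on:.0f} FPS "
      f"({1000 / fps_rt_on:.1f} ms per frame vs {1000 / fps_rt_off:.1f} ms with RT off)")
```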
 
I'm talking about Cyberpunk specifically. I've poured 600+ hours into that game -- the RT is one of the best implementations I've seen, and it still looks like crap (IMO).

View attachment 379240


It's grainier, blurrier:

View attachment 379241

Pick the RT shot -- it's the one on the left.

RT reminds me of what Nvidia did several moons ago with HDR - Pixel Shader 3.0. If my memory serves me correctly, it was the 6xxx series that brought HDR, which was their sales pitch.

ATi at the time with their X800 series was just running Pixel Shader 2.0. I'm going to leave an example here of the difference of HDR on/off from back in the old days with Elder Scrolls IV: Oblivion

With HDR:
[attached Oblivion screenshot]


Without HDR:
[attached Oblivion screenshot]


I used to think HDR was the shizz nizz back in the day, but as I've been playing over the years, I've noticed that the HDR colours were very bright and that bloom was a more faithful representation of real-life colours.

You could also run 8x AA with Bloom but not with HDR, which I have done in the example screenshots shown above.

Which one do you think is better?

Great screen shots of your RT implementation btw. Nice Work!
 
Right - it's almost the same, with the RT being slightly blurrier... for -60% FPS.

And it's not like I had the lowest RT setting turned on - this was a 4090 with everything cranked. Some scenes look cool, but then you turn off RT and realize they're just as cool with it off, but now you also get 150 FPS.
Exactly. No one denies that RT is nice. The problem is the performance cost even on Nvidia, and the fact that it isn't really a night and day difference, just a little icing on the cake. If you turn it on, you see it's nice. But then you turn it off and still enjoy your game just the same. After 5-10 minutes, you don't even care.
 
Maybe my eyes are bad, but it's hardly any different, IMO.
Someone posts screenshots chosen to demonstrate no/little difference;

Wow, there's hardly any difference! :rolleyes:
 
RT reminds me of what Nvidia did several moons ago with HDR - Pixel Shader 3.0. If my memory serves me correctly, it was the 6xxx series that brought HDR, which was their sales pitch.

ATi at the time with their X800 series was just running Pixel Shader 2.0. I'm going to leave an example here of the difference of HDR on/off from back in the old days with Elder Scrolls IV: Oblivion

With HDR:
View attachment 379337

Without HDR: View attachment 379338

I used to think HDR was the shizz nizz back in the day, but as I've been playing over the years, I've noticed that the HDR colours were very bright and that bloom was a more faithful representation of real-life colours.

You could also run 8x AA with Bloom but not with HDR, which I have done in the example screenshots shown above.

Which one do you think is better?

Great screen shots of your RT implementation btw. Nice Work!
That game was amazing -- I am partial to the HDR oversaturated mushroom trip version - especially with the expansions. Bethesda at their peak.

The 8x AA looks 'better', but the no-AA crispness with the oversaturated colors kind of has that Oblivion mood. When I got my mitts on the 8800GT, you could do HDR with CSAA and a 60 FPS vsync lock, which back then was like the pinnacle of gaming graphics for me.
 
When I got my mitts on the 8800GT

I was running an X800 XT PE at the time of Oblivion, so only Pixel Shader 2.0 when I first started playing it. Then I got a 7800 GT, which would allow me to run Pixel Shader 3.0. Yes, they were great graphics for that era, but it wasn't until the 8800 GTS 640MB, when I was running Crysis in DX10, that I thought it was the pinnacle of graphics - and for some time, I might add.

I haven't read the whole thread guys, but the 9070 XT doesn't look too bad if they price it competitively.

Anybody got any idea of these cards' pricing atm?
 
That game was amazing -- I am partial to the HDR oversaturated mushroom trip version - especially with the expansions. Bethesda at their peak.

The 8x AA looks 'better', but the no-AA crispness with the oversaturated colors kind of has that Oblivion mood. When I got my mitts on the 8800GT, you could do HDR with CSAA and a 60 FPS vsync lock, which back then was like the pinnacle of gaming graphics for me.
I agree. That game made me swap my amazing ATi X800 XT for an overheating, loud mess of a card known as the 7800 GS AGP just to be able to play it with HDR. Good old times! :)

And we have people here saying that I don't care about features. Of course I do when they're good. The problem with features these days is that they either make your game run like a slideshow (RT) or make it a blurry mess (upscaling), not to mention manufacturers use them as excuses to charge more for cards that have no business being in the price range they're in, which I find disgusting.
 
With HDR:
Honestly, I love this one, with the colour saturation, sky highlight and seemingly deeper contrast. The 6800 Ultra was the first top GPU I ever bought, and it was great to taste those visuals.

Pity for me, much like CP2077, it's just not quite my kind of game from an actual gameplay perspective.

For RT, clearly I'm an enjoyer but that doesn't mean I vouch for universally turning it on in every game, every situation and so on. But boy I've had times it has absolutely added to the visual immersion and to an extent - blown me away.

AMD's talk and posturing would seem to suggest they are focusing on it too, seeing its merit in attracting customers and making a more rounded, capable product. They already have a fairly dedicated crowd of people that are all about their cards; their bigger issue is getting the ones that don't currently use them, and perhaps haven't for a few generations, to come back / jump on.
 
RT reminds me of what Nvidia did several moons ago with HDR - Pixel Shader 3.0. If my memory serves me correctly, it was the 6xxx series that brought HDR, which was their sales pitch.

ATi at the time with their X800 series was just running Pixel Shader 2.0. I'm going to leave an example here of the difference of HDR on/off from back in the old days with Elder Scrolls IV: Oblivion

With HDR:
View attachment 379337

Without HDR: View attachment 379338

I used to think HDR was the shizz nizz back in the day, but as I've been playing over the years, I've noticed that the HDR colours were very bright and that bloom was a more faithful representation of real-life colours.

You could also run 8x AA with Bloom but not with HDR, which I have done in the example screenshots shown above.

Which one do you think is better?

Great screen shots of your RT implementation btw. Nice Work!

Ah, that was DirectX 9.0c. Shader Model 3.0 brought parity between DirectX on Windows and the Xbox 360's graphics capabilities; Oblivion's bloom shader was a fallback path for older DirectX 9.0b cards like the GeForce FX series, which were about 3 years old when it came out. Oblivion really stretched the pre-unified-shader GPUs to the max, and IMO it still looks stunning to this day. No official pricing info on the new Radeon cards either; I reckon it comes soon. That being said...

STOP RIGHT THERE, CRIMINAL SCUM. Nobody plays Oblivion with the nasty bloom shader on my watch. I'm confiscating your stolen goods. Now pay your fine, or it's off to jail.
 
Nobody plays Oblivion with the nasty bloom shader on my watch.

It's funny, you know - I used to not touch bloom when I was running GeForce cards, then one day I tried it and, pow, I was hooked. Not sure if it was the 8x AA that was helping through the forests or what.

I remember when I first turned on HDR vs Bloom with the GeForce cards - geez, it chewed the card. Kinda why I'm comparing it a bit to what RT does in this day and age.

Thread/
 
It's funny, you know - I used to not touch bloom when I was running GeForce cards, then one day I tried it and, pow, I was hooked. Not sure if it was the 8x AA that was helping through the forests or what.

I remember when I first turned on HDR vs Bloom with the GeForce cards - geez, it chewed the card. Kinda why I'm comparing it a bit to what RT does in this day and age.

Thread/

It's true, though. In the beginning it was hardware transform and lighting, then HDR rendering effects, then GPU-accelerated physics, tessellation, instancing, now ray tracing... each generation of games has brought its own challenges to hardware and graphics drivers. RT is one of the most complex graphics techniques ever and widely considered to be the holy grail of computer graphics, as it enables truly photorealistic scene generation; the problem is the simply ginormous amount of compute required to pull this off. NV uses AI as a crutch to achieve that goal, but true RT is probably within the next 5 GPU generations, IMO.
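To put a rough number on that compute problem, here's a hypothetical ray-count estimate - the resolution, sample and bounce counts below are my own assumptions for illustration, not figures from anyone in this thread:

```python
# Back-of-the-envelope ray budget for "true" real-time path tracing
# (all inputs are illustrative assumptions).
width, height = 3840, 2160        # 4K output
samples_per_pixel = 4             # assumed paths traced per pixel per frame
bounces = 3                       # assumed bounces per path
fps = 60                          # real-time target

rays_per_frame = width * height * samples_per_pixel * bounces
rays_per_second = rays_per_frame * fps
print(f"{rays_per_frame / 1e6:.0f} M rays per frame, "
      f"{rays_per_second / 1e9:.1f} G rays per second")
# ~100 M rays per frame and ~6 G rays per second, before shading any of the
# hit points - which is why denoisers and AI reconstruction are leaned on to
# get away with far fewer samples.
```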
 
It's funny, you know - I used to not touch bloom when I was running GeForce cards, then one day I tried it and, pow, I was hooked. Not sure if it was the 8x AA that was helping through the forests or what.

I remember when I first turned on HDR vs Bloom with the GeForce cards - geez, it chewed the card. Kinda why I'm comparing it a bit to what RT does in this day and age.

Thread/
My favourite workaround to gain more performance was enabling HDR while disabling grass. Grass ate your GPU even harder than HDR, I'd say. Not to mention you could find your missed arrows a lot easier without it. :laugh:
 
I really wish that AMD could develop an open AI accelerator and destroy CUDA.
Don't sell only gimmicks. Prove it, AMD!!
 
You literally said:

So which is it?
It's speculation. Nobody knows for sure.
It would be great if Navi44 was a 192-bit design with 12GB in its full configuration. That would give us a 12GB XT model and maybe an 8GB cut-down variant as the vanilla 9060.
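For what it's worth, the bus-width speculation maps cleanly onto those capacities if you assume standard 16 Gbit (2 GB) GDDR6 chips on 32-bit channels - an assumption on my part, not anything leaked:

```python
# 192-bit bus with assumed 2 GB (16 Gbit) GDDR6 chips on 32-bit channels.
bus_width_bits = 192
bits_per_chip = 32
gb_per_chip = 2

chips = bus_width_bits // bits_per_chip                     # 6 memory chips
print(f"{bus_width_bits}-bit: {chips * gb_per_chip} GB")    # -> 12 GB
# A 128-bit cut-down with the same chips lands at 8 GB.
print(f"128-bit: {(128 // bits_per_chip) * gb_per_chip} GB")
```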
Let's face it though, nobody outside of AMD and its partners really knows yet, but the die-size leaks suggest that Navi44 is much smaller than Navi48, which doesn't make sense if it's supposed to have 75% of the hardware that Navi48 does. If the die-size leaks are accurate, Navi44 is somewhere between Navi 24 and Navi 23/33 - i.e. somewhere between the 6500 XT and the 6600 series in size.

Asus TUF leaks/rumours from WCCF, for example:
[attached screenshot]


DigitalTrends:
[attached screenshot]
 
Exactly. No one denies that RT is nice. The problem is the performance cost even on Nvidia, and the fact that it isn't really a night and day difference, just a little icing on the cake. If you turn it on, you see it's nice. But then you turn it off and still enjoy your game just the same. After 5-10 minutes, you don't even care.
This always astonishes me. How is it possible that such a useless and, at the same time, very, very expensive feature (from both a hardware and a price standpoint) has reached such a prominent role in EVERY GPU discussion among users?
Why do people ALWAYS pop up with "eh, but the ray tracing..."?
 
This always astonishes me. How is it possible that such a useless and, at the same time, very, very expensive feature (from both a hardware and a price standpoint) has reached such a prominent role in EVERY GPU discussion among users?
Why do people ALWAYS pop up with "eh, but the ray tracing..."?
Because it's not useless. If you want games with dynamic lighting and open worlds that also look good, you need RT in some way, shape or form.
It doesn't need to be full-on path tracing; it can be mixed with probes, done in software like Lumen when it traces against SDFs, or handled by AMD's version of GI that also uses RT acceleration. It's either RT, or we go back to baked lighting or probes, stagnate, and have the same-looking games forever.
Shadows are the same story.

Sure, many games implement it just to say they have RT, and there it's only a gimmick that can be turned off, but that's on the devs of that game, not on the tech itself.
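To make the baked-versus-traced point concrete, here's a tiny toy sketch (entirely my own illustration - the scene, numbers and function name are made up and look nothing like real engine code): lighting baked with an occluder in one position stays wrong after the occluder moves, while a traced shadow query follows the scene.

```python
import random

# Toy 2D scene: a shade point on a floor, a small area light overhead,
# and a movable occluder halfway up. All values are illustrative.
LIGHT_Y = 10.0          # light height above the floor
OCCLUDER_Y = 5.0        # occluder height
OCCLUDER_HALF_W = 0.5   # occluder half-width

def traced_visibility(point_x, occluder_x, samples=1024):
    """Cast shadow rays toward random points on the area light and return
    the fraction that reach it unblocked (a soft-shadow visibility query)."""
    unblocked = 0
    for _ in range(samples):
        light_x = random.uniform(-0.5, 0.5)      # jittered point on the area light
        t = OCCLUDER_Y / LIGHT_Y                  # where the ray crosses occluder height
        ray_x = point_x + (light_x - point_x) * t
        if abs(ray_x - occluder_x) > OCCLUDER_HALF_W:
            unblocked += 1
    return unblocked / samples

# "Bake": capture visibility once, with the occluder off to the side.
baked = traced_visibility(point_x=0.0, occluder_x=5.0)

# Runtime: the occluder has moved directly overhead; baked data can't know that.
traced_now = traced_visibility(point_x=0.0, occluder_x=0.0)

print(f"baked lighting (stale):       {baked:.2f}")       # ~1.00, still fully lit
print(f"traced lighting (up to date): {traced_now:.2f}")  # 0.00, correctly shadowed
```

The same idea scales up to GI, reflections and shadows in shipping engines; the debate is about how many of those queries you can afford per pixel, not whether the traced result tracks the scene better.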
 