
AMD Radeon RX 9070 XT Benchmarked in 3DMark Time Spy Extreme and Speed Way

Unbelievable.. :kookoo:

Let's compare, cheapest AMD vs cheapest Nvidia:

[attachments: price listings, cheapest AMD vs cheapest Nvidia]
The 4090 is a different story - that's the absolute best card on the market, and for those, Nvidia can ask (almost) whatever money they want, as there'll always be people who want the best, no matter what (it also provides 23% better performance, so it's not like you pay twice as much for the same performance).

But if you compare two cards from the same tier, you'll get same-ish pricing:

[attachments: price listings for two same-tier cards]
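Quick back-of-the-envelope on that point, since "23% faster for twice the price" is really a perf-per-dollar argument. A minimal sketch; the prices and relative-performance numbers below are illustrative placeholders, not quotes from any retailer or review:

```python
# Perf-per-dollar check for the "23% faster at twice the price" point.
# The figures below are illustrative placeholders; swap in current prices
# and a relative-performance index from a trusted review.

def perf_per_dollar(rel_perf: float, price_usd: float) -> float:
    """Relative performance points per dollar spent."""
    return rel_perf / price_usd

cards = {
    "flagship (4090-class)": (123.0, 2000.0),  # ~23% faster, ~2x price (assumed)
    "high-end (baseline)":   (100.0, 1000.0),  # baseline perf and price (assumed)
}

for name, (perf, price) in cards.items():
    print(f"{name}: {perf_per_dollar(perf, price):.3f} perf/$")
# the flagship buys ~38% less performance per dollar in this example
```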
 
The 9060 series worries me with its 8GB of VRAM.

A friend of mine bought a 4060 on discount recently, not because it was a good card, but because he has an extremely cramped mITX system with only a 350W PSU, and I figured the tiny, 115W 4060 was his best bet. I could be wrong, but it seemed to me to be the smallest, most power-efficient card around.

He's running a 2560x1080 display, so not even 1440p, and yet two of the three games he upgraded for (Indiana Jones, DA: Veilguard, Space Marine II) require him to turn settings down to avoid stuttering caused by VRAM shortages.

So yeah, the 8GB 9060 cards need to be 20% cheaper than a 12GB B580. 8GB wasn't enough for more than 1080p in 2022, and it sure as hell isn't any better in 2025. I'm just as worried about the 5060, but it will sell like hotcakes because people will probably just believe Jensen when he says something like "a 5060 for $349 matches a 4080. The more you buy, the more you save" or some other hand-wavy nonsense. My experience of DLSS FG and RT on 40-series cards is that all those fake-framed RT effects need a boatload more VRAM than the raster codepath alone, and if the 5060 only gets 8GB, that's not going to go down well.
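To put rough numbers on that VRAM point, here's a back-of-the-envelope sketch. The per-buffer math is exact; how many extra full-resolution buffers FG keeps in flight and how large a game's RT acceleration structure is are my own loose assumptions, not vendor figures:

```python
# Rough VRAM cost of frame generation + RT at 2560x1080.
# Buffer-size math is exact; buffer counts and BVH size are assumptions.

WIDTH, HEIGHT = 2560, 1080
BYTES_PER_PIXEL = 8          # e.g. an RGBA16F intermediate buffer

frame_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / 2**20
extra_buffers = 3            # assumed: previous frame, motion vectors, generated frame
bvh_mb = 500                 # assumed mid-sized RT acceleration structure (BVH)

print(f"one full-res RGBA16F buffer: {frame_mb:.0f} MB")                    # ~21 MB
print(f"FG overhead ({extra_buffers} extra buffers): {extra_buffers * frame_mb:.0f} MB")
print(f"RT acceleration structure:  ~{bvh_mb} MB")
# On an 8GB card already near its limit in raster, a few hundred extra MB
# is exactly the difference between smooth and stuttering.
```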

Where has there been any info on the 9060/9060 XT VRAM capacity? I’d expect 12GB on a 192-bit bus for the 9060/9060 XT, and the 9050/9050 XT to be cut down to 128-bit with PCIe lanes shaved off again (>_>).

*It’s also confirmed by Paul at Paul's Hardware that the 9070 retains a 16GB config, based on the PowerColor product label on one of their two-slot designs.

I fully expect both the 9060 XT and 9060 to sport 12GB of VRAM at minimum.
 
The 9060 series worries me with its 8GB of VRAM.
Is 8GB confirmed for the 9060? 10GB/12GB would be a better fit; save 8GB for the entry-level 9050 if they release a lower SKU, and don't gimp its PCIe lanes like the 6400/6500.
 
Actually, all cards in the 3DMark Top 100 are heavily OC'd and use liquid cooling or better.

Those Top 100 scores are in no way representative of a particular model. They merely show best OCing samples.
Exactly. 3DMark is a mess for this stupid reason and that's why I asked, because honestly the numbers are totally meaningless without stock results to compare to.

Does any website test GPUs at stock settings with 3DMark and publish scores of GPUs that are representative of what people actually own?
 
Maybe a comparison to the 7900 XT would be more useful :confused:.
Yeah,
normal people don't run someMark3D every day, or at all.
That number is like Monopoly money at the cashier...
Mid-range Nvidia and AMD card benchmark numbers would make this a meaningful article.
 
Is 8GB confirmed for the 9060? 10GB/12GB would be a better fit; save 8GB for the entry-level 9050 if they release a lower SKU, and don't gimp its PCIe lanes like the 6400/6500.
8GB for the 9060 is not confirmed, no. The 9070 series has been announced, and unofficial tests/specs of the 9070 XT that leaked out before the review embargo confirm that it's a 256-bit, 16GB card like the 7800 XT, with 4096 cores that boost to around 3GHz.

There's precious little noise about the 9060 series, just rumours and speculation that they're coming in March 2025. Most of the rumours suggest that the Navi 44 powering the 9060 and 9060 XT is exactly half of the Navi 48 powering the 9070 and 9070 XT; that would make it a 128-bit, 8GB card, most likely with 2048 cores and cut down to 8 lanes of PCIe too. I would guess that if these rumours are accurate, the 9060 XT will use expensive double-density GDDR6 modules to boost the puny GPU to an acceptable quantity of VRAM, but that will also push its price into a higher, unacceptable bracket for the performance it offers.

Hopefully the rumours are wrong and we do get a 192-bit, 12GB 9060 XT, as that's probably what the sub-$300 market needs more than anything else right now. The B580 kind of nailed it there!
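For reference, the bus-width-to-capacity arithmetic behind all these configs is simple. A minimal sketch assuming the common 16Gbit (2GB) GDDR6 modules; the last line shows the clamshell route (two modules per channel), which is one way "double density" can get 16GB onto a 128-bit bus:

```python
# GDDR6 capacity from bus width: one 32-bit channel per memory module.

def vram_gb(bus_bits: int, gb_per_module: int, modules_per_channel: int = 1) -> int:
    channels = bus_bits // 32
    return channels * modules_per_channel * gb_per_module

print(vram_gb(128, 2))     # 8 GB  - rumoured Navi 44 / 9060 config
print(vram_gb(192, 2))     # 12 GB - the hoped-for 9060 XT config
print(vram_gb(256, 2))     # 16 GB - the 9070 XT config
print(vram_gb(128, 2, 2))  # 16 GB - clamshell, two modules per channel
```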
 
The 4090 is a different story - that's the absolute best card on the market, and for those, Nvidia can ask (almost) whatever money they want, as there'll always be people who want the best, no matter what (it also provides 23% better performance, so it's not like you pay twice as much for the same performance).

But if you compare two cards from the same tier, you'll get same-ish pricing:

[attachments: price listings for two same-tier cards]
When it comes to games, the 7900 XT fares great, and it's still cheaper:
[attachments: gaming benchmark and price screenshots]

The 7900 XTX even beats the 4080 Super while being cheaper than the 4070 Ti.
I sorted for the cheapest on Newegg:
[attachments: Newegg price listings]


So yeah, the 4090 is king, but if you shoot lower than the top,
AMD was simply better in 2024.
And also yeah, in some specific games you may do better with Nvidia; I was speaking in general terms.
And if you focus on AI stuff, Nvidia might be generally better, but that's not the average use case.
 
Does any website test GPUs at stock settings with 3DMark and publish scores of GPUs that are representative of what people actually own?
Guru3D tests in Fire Strike Ultra, Time Spy, Steel Nomad, Port Royal, and the full-RT feature test.
 
That’s what every generation's 60- and 70-class GPUs have been like from both brands since... forever. It’s not some new concept.
Man...thanks for enlightening me. /bow
 
Thanks for the sarcasm and adding nothing to the conversation *shrug
Thank you for the sarcasm. This product is underwhelming on all counts. Just like Intel's recent GPU. *golf clap
 
I've jokingly hypothesized that the over-promising leaks that seem to precede every single Radeon release are just misinformation from Nvidia fanboys.
You think so? Nah, couldn't be.
 
The 9060 series worries me with its 8GB of VRAM.

A friend of mine bought a 4060 on discount recently, not because it was a good card, but because he has an extremely cramped mITX system with only a 350W PSU, and I figured the tiny, 115W 4060 was his best bet. I could be wrong, but it seemed to me to be the smallest, most power-efficient card around.

He's running a 2560x1080 display, so not even 1440p, and yet two of the three games he upgraded for (Indiana Jones, DA: Veilguard, Space Marine II) require him to turn settings down to avoid stuttering caused by VRAM shortages.

So yeah, the 8GB 9060 cards need to be 20% cheaper than a 12GB B580. 8GB wasn't enough for more than 1080p in 2022, and it sure as hell isn't any better in 2025. I'm just as worried about the 5060, but it will sell like hotcakes because people will probably just believe Jensen when he says something like "a 5060 for $349 matches a 4080. The more you buy, the more you save" or some other hand-wavy nonsense. My experience of DLSS FG and RT on 40-series cards is that all those fake-framed RT effects need a boatload more VRAM than the raster codepath alone, and if the 5060 only gets 8GB, that's not going to go down well.
Could you send me a link where AMD says the 9060 series will be 8GB?
 
This is beyond tiresome.

It's supposed to be a mid-tier card, and as others have said, even Ngreedia's 90-class GPUs have issues with RT even after all kinds of trickery.

And as others have said, it's a performance killer that provides nothing to gameplay.

All AMD needs to do is price this damned thing right and stop trying to be greedy.
Yeah, a $500 card can't match a $1,000 one in RT, only in raster. What an utter piece of...!
 
Interesting performance numbers. I'm in the middle of putting together my newest gaming PC, and it will have a flavour of this card in it (even with the stupid name). It'll be a decent upgrade over what I have now. I do wish we'd see more focus on power efficiency with GPUs and CPUs.
 
Did you notice that the 3DMark test was done with a 285K?
If that makes any difference...

5900X + 7900XTX (TBP 366W, GPU clock 2500~2530MHz, VRAM 2600MHz)

[attachment: 3DMark result screenshot]
 
Seems like they really wanted to focus on the two points everybody was complaining about, namely FSR and RT. Now there's only one thing left: price.
And once everything aligns with the stars and planets under those 3 things, Nvidia can lower their prices and we can buy cheaper Nvidia cards!!!!!!!!!!! F yeaaaaaaaaa, screw AMD loooosers! :laugh:
 
5900X + 7900XTX (TBP 366+10%=402W, GPU clock 2620~2670MHz, VRAM 2600MHz)

[attachment: 3DMark result screenshot]
 
I...can only say the following...TressFX.

Before you ask why I cite the two above comments, it's because ray tracing is about as stupid as TressFX was. It's "better" than the results you get from the other guy doing the same calculations...but completely forgets that 99.9% of games that exist now were made before ray tracing was adopted. You're more than welcome to claim you think it looks more realistic...and someone else is more than able to call it crap. Those are not debatable points, only opinions. The truth is that it's a computationally intensive process that doesn't result in linear or better improvements...and thus will be relegated to the dustbin of history exactly like TressFX. The only difference is that Nvidia has clung to their dead horse for longer because AMD has not competed with them, and thus it's always something they win at. It's always easiest to be the best when nobody else competes.

The only facts are the cost-to-performance numbers that this card will eventually have after a proper review... and hopefully it will be priced competitively. Yesterday's performance in RT, today's performance in raster, and yesterday's pricing would be a great boon. That's especially true when today's pricing is highway robbery and yesterday's hardware provides enough performance for most people today.
You seem to assume that TressFX simply died and games just kept using the same old hair dynamics. That's not the case; the principles behind the tech got natively implemented in game engines. The same thing happened with GameWorks.
Hair Physics in Unreal Engine - Overview | Unreal Engine 5.5 Documentation | Epic Developer Community
Here you can see a video of the tech in action:
Enabling Physics Simulation on Grooms in Unreal Engine | Unreal Engine 5.5 Documentation | Epic Developer Community
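To make the point concrete: the core principle behind TressFX-style strand physics is just particles integrated with Verlet and pulled back together by distance constraints, which is why engines could absorb it natively. A deliberately minimal toy sketch of that idea, not AMD's or Epic's actual implementation:

```python
# Toy strand physics: a chain of particles moved by Verlet integration,
# then corrected by distance constraints so segments keep their rest length.

GRAVITY = -9.8
SEGMENT = 0.1          # rest length between adjacent particles
DT = 1 / 60
ITERS = 4              # constraint-relaxation passes per step

# strand of 10 particles hanging from a fixed root at the origin
pos  = [(0.0, -i * SEGMENT) for i in range(10)]
prev = [p for p in pos]

def step():
    global pos, prev
    # Verlet integration: new = 2*pos - prev + a*dt^2
    new = []
    for (x, y), (px, py) in zip(pos, prev):
        new.append((2 * x - px, 2 * y - py + GRAVITY * DT * DT))
    prev, pos = pos, new
    pos[0] = (0.0, 0.0)                 # root stays pinned
    # relax segment lengths back to SEGMENT
    for _ in range(ITERS):
        for i in range(len(pos) - 1):
            (x1, y1), (x2, y2) = pos[i], pos[i + 1]
            dx, dy = x2 - x1, y2 - y1
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
            corr = 0.5 * (dist - SEGMENT) / dist
            pos[i]     = (x1 + dx * corr, y1 + dy * corr)
            pos[i + 1] = (x2 - dx * corr, y2 - dy * corr)
        pos[0] = (0.0, 0.0)

for _ in range(60):
    step()
print(pos[-1])   # tip of the strand after one simulated second
```

Real implementations run exactly this kind of loop per strand on the GPU, with bending/twist constraints and collision added on top.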

AMD is working on a loooot of technologies available for game devs who might not have an equivalent alternative in their engines just yet.
[attachment: list of AMD technologies for game developers]
 
Could you send me a link where AMD says the 9060 series will be 8GB?
Nope, because AMD haven't said that yet to the best of my knowledge.

This is what I said:
There's precious little noise about the 9060 series, just rumours and speculation that they're coming in March 2025. Most of the rumours suggest that the Navi 44 powering the 9060 and 9060 XT is exactly half of the Navi 48 powering the 9070 and 9070 XT

Here are a few examples of the speculation so far:

One of the YouTubers (it might have been one of the Steves) speculated that the package-size decrease from Navi 48 was too much for Navi 44 to be a 3072-core/192-bit/12GB config, and that it was even smaller than the 2048-core/128-bit/8GB config of the RX 7600 and 6600 series. There has been a node shrink, but if Navi 44 is even smaller than Navi 33 or 23, it's "likely to be a 2048 quad-channel solution" (GDDR6 channels are 32 bits wide, so presumably that means a 128-bit card).

Some redditors speculated that Navi 48 was just two Navi 44 dies glued together, but that's been debunked now that we have actual die shots of Navi 48.
 
Nope, because AMD haven't said that yet to the best of my knowledge.

This is what I said:


Here are a few examples of the speculation so far:

One of the YouTubers (it might have been one of the Steves) speculated that the package-size decrease from Navi 48 was too much for Navi 44 to be a 3072-core/192-bit/12GB config, and that it was even smaller than the 2048-core/128-bit/8GB config of the RX 7600 and 6600 series. There has been a node shrink, but if Navi 44 is even smaller than Navi 33 or 23, it's "likely to be a 2048 quad-channel solution" (GDDR6 channels are 32 bits wide, so presumably that means a 128-bit card).

Some redditors speculated that Navi 48 was just two Navi 44 dies glued together, but that's been debunked now that we have actual die shots of Navi 48.
You literally said:
The 9060 series worries me with its 8GB of VRAM.
So which is it?
 
This is very good news. If true, it points to only a 5% performance difference in raster vs the 4080S (+13% vs the 4070 Ti Super), but real-world RT performance will be (in the best-case scenario) close to, yet slower than, the 4070 Ti Super (which is still good news, imo).
The real question is whether the reference model stays at 265W (that's what all the leakers were saying) or the board power was increased in a last-minute decision by AMD.
Also, if this is an OC model, is the 3060MHz core speed a factory OC or a manual OC? (Remember that at the RX 7900 XTX launch, in W1zzard's review of the ASUS TUF, the average clock of the reference card was 2630MHz, and the ASUS model could hit 3200MHz with a manual OC for a +15% real-world performance increase vs reference. The leaked RX 9070 XT is probably also an ASUS TUF, though it could be a Prime.)
But the most important thing is MSRP; with the right price, any performance will be acceptable!
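As a quick sanity check on those numbers, using only the figures quoted above (~2630MHz reference average, ~3200MHz manual OC, +15% performance), the clock uplift clearly doesn't translate one-to-one into performance:

```python
# Clock uplift vs. actual performance uplift from the 7900 XTX example above.

ref_clock, oc_clock = 2630, 3200
perf_gain = 0.15                        # measured, from the review cited above

clock_gain = oc_clock / ref_clock - 1   # ~21.7% higher clock
scaling = perf_gain / clock_gain        # fraction of the clock uplift realised

print(f"clock uplift: {clock_gain:.1%}")
print(f"perf  uplift: {perf_gain:.1%}")
print(f"scaling efficiency: {scaling:.0%}")  # ~69%: performance scales sub-linearly
```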
 