
AMD Radeon RX 9070 XT Benchmarked in 3D Mark Time Spy Extreme and Speed Way

It's speculation. Nobody knows for sure.
It would be great if Navi44 was a 192-bit design with 12GB in its full configuration. That would give us a 12GB XT model and maybe an 8GB cut-down variant as the vanilla 9060.
Let's face it though, nobody outside of AMD and its partners really knows yet, but the die-size leaks suggest that Navi44 is much smaller than Navi48, which doesn't make sense if it's supposed to have 75% of the hardware that Navi48 does. If the die-size leaks are accurate, Navi44 is somewhere between Navi 24 and Navi 23/33, i.e. somewhere between the 6500 XT and the 6600 series in size.

Asus TUF leaks/rumours from WCCF, for example:
View attachment 379365

DigitalTrends:
View attachment 379366
16 GB on a 192-bit bus is not possible. It's either 8/16 GB on 128-bit, or 12 GB on 192-bit. A site like Digital Trends should know this.
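The arithmetic behind this: each GDDR6 device has a 32-bit interface, and the common densities are 1 GB and 2 GB per chip, with clamshell mounting doubling the chip count on the same bus. A quick sketch of the capacities each bus width allows (the density list and the clamshell assumption are simplifications on my part):

```python
# Possible VRAM capacities (in GB) for a given GDDR6 bus width.
# Each chip has a 32-bit interface; common densities are 1 GB and 2 GB,
# and clamshell mode doubles the chip count without widening the bus.
def possible_capacities(bus_width_bits: int) -> set[int]:
    chips = bus_width_bits // 32
    densities_gb = (1, 2)
    return {chips * d * mode for d in densities_gb for mode in (1, 2)}

print(sorted(possible_capacities(128)))  # [4, 8, 16]
print(sorted(possible_capacities(192)))  # [6, 12, 24] -> no 16 GB option
```

So 128-bit gives 8 GB or (clamshell) 16 GB, while 192-bit lands on 12 GB or 24 GB; 16 GB simply isn't reachable on six 32-bit channels with these densities.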

If you want games with dynamic lighting and open worlds that also look good, you need RT in some way, shape, or form.
No you don't. Have you tried Alan Wake 2 without RT? It's basically indistinguishable from using RT, I'd say.

It's more prominent in Cyberpunk, but even there, all I see right away without pixel-peeping is shiny puddles. Not exactly a revolution in gaming tech.

Ok thanks a lot, I wanted to compare yours with the 9070xt.
You really have a good card.
From AMD's statements: "All these performance leaks, well, it is accurate for the way the driver performs on the card right now. It is nowhere near where the card will actually perform once we release the full performance driver.
Journalist: Did that also factor into your decision?
Azor: It’s not a readiness issue.
McAfee: We have in house the full performance driver. We intentionally chose to enable partners with a driver which exercises all of the, let’s call it, thermal-mechanical aspects of the card, without really running that risk of leaking performance on critical aspects of the product. That’s pretty standard practice."

It looks like it could perhaps match a 7900 XTX.
Good stuff. :) 14k vs 14.5k is already a match, if I dare to say (within a difference undetectable to the naked eye).

In other news: We might get something - a full release, or maybe just some more info, on 22 Jan.
 
No you don't. Have you tried Alan Wake 2 without RT? It's basically indistinguishable from using RT, I'd say.

It's more prominent in Cyberpunk, but even there, all I see right away without pixel-peeping is shiny puddles. Not exactly a revolution in gaming tech.


It's still using RT, even when you have Path Tracing disabled, just not per-triangle and less accurate. They probably could have used hardware acceleration to make it look better without PT, but then Nvidia came along and probably offered them a deal.
 
16 GB on a 192-bit bus is not possible. It's either 8/16 GB on 128-bit, or 12 GB on 192-bit. A site like Digital Trends should know this.
I think you're misinterpreting that article:
"If the above leak is true, we might still see an RX 9060 with 8GB memory and an RX 9060 XT with 16GB, but the XT model might now get a wider, 192-bit interface."

So the first part of that sentence hangs on the 'if the above leak is true' condition and refers to a 128-bit Navi44 in 8GB and 16GB flavours for the 9060 and 9060XT respectively.

The second part of that sentence "but the XT model might now get a wider, 192-bit interface" is referring to newer information that Navi44 might actually be a 192-bit die, overruling the "if the above leak is true" first part of that sentence.

I don't read that sentence and interpret it as "16GB 192-bit 9060XT".
I interpret it as "the leak says 8GB and 16GB 128-bit cards, and newer rumours hint that it's actually a 192-bit design".

If Navi44 is actually a 192-bit design, that doesn't rule out a die-harvested, cut-down version being sold as the vanilla 9060 with only 128 bits enabled, likely in 8GB configs, but possibly also a 16GB config - if not for gamers, then potentially for a Radeon Pro workstation variant.
 
I think you're misinterpreting that article:
"If the above leak is true, we might still see an RX 9060 with 8GB memory and an RX 9060 XT with 16GB, but the XT model might now get a wider, 192-bit interface."

So the first part of that sentence is referencing the 'if the above leak is true' and refers to a 128-bit Navi44 in 8GB and 16GB flavours for the 9060 and 9060XT respectively.
It clearly says "and an RX 9060 XT with 16GB", then it speculates about a 192-bit bus, there's no way around it. You know it's wrong, I know it's wrong, but for the masses, it's just misinformation.

The sentence should have read something like:
"If the above leak is true, we might still see an RX 9060 with 8GB memory and an RX 9060 XT with 12 GB on a 192-bit interface."
 
Good stuff. :) 14k vs 14.5k is already a match, if I dare to say (within a difference undetectable to the naked eye).

Just don't forget the CPU score.
Unlike Speed Way, the CPU score affects the final score in Time Spy.
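For anyone wondering why the CPU matters here: to my recollection of the 3DMark technical guide, the Time Spy overall score is a weighted harmonic mean of the graphics and CPU sub-scores, with the graphics score weighted much more heavily (roughly 0.85 vs 0.15 - treat the exact weights as an assumption on my part). A quick sketch:

```python
def timespy_overall(graphics_score: float, cpu_score: float,
                    w_gpu: float = 0.85, w_cpu: float = 0.15) -> float:
    """Weighted harmonic mean of the sub-scores (weights are approximate)."""
    return (w_gpu + w_cpu) / (w_gpu / graphics_score + w_cpu / cpu_score)

# Same graphics score, two different CPU scores: the overall score
# shifts by several hundred points, so comparisons need matched CPUs.
print(round(timespy_overall(14500, 9000)))
print(round(timespy_overall(14500, 13000)))
```

The harmonic mean also means a weak sub-score drags the total down more than a strong one lifts it, which is why a big CPU gap shows up clearly in the final number.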
 
I mean, we still don't really know jack about squat, and a lot of the die-size rumors regarding N48 appear to have been wrong (lots of guesses at a sub-300 mm² die).

So who's to say N44 isn't a smidge larger than initially guessed and ends up with a 192-bit bus?

Honestly, it wouldn't surprise me to see N33 get rebranded and crammed in down at the bottom as a 9050; that card can't cost more than a buck to make at this point.
 
Can't wait to see the 9060. Should be a good card for people who don't spend half their rent or more on GPUs.
Bro, I wish this was half my rent!!
 
I don't like the name change. They should have stuck with 9700XT. Anyhow, hopefully the 9070 XT performs near or at 7900 XT levels for a decent price tag.
It's too bad Nvidia is pushing gimmicks like frame generation and charging a high cost for this turd.
 
It clearly says "and an RX 9060 XT with 16GB", then it speculates about a 192-bit bus, there's no way around it. You know it's wrong, I know it's wrong, but for the masses, it's just misinformation.

The sentence should have read something like:
"If the above leak is true, we might still see an RX 9060 with 8GB memory and an RX 9060 XT with 12 GB on a 192-bit interface."
It's ambiguous. I'm not defending that, and I hate ambiguous wording, but I'm just the messenger; it's their article, not mine. If they hadn't inserted the word "now" (bolded below), I'd be inclined to agree with your interpretation of what they wrote, but the word "now" completely changes the meaning and makes little sense in the sentence otherwise.

"If the above leak is true, we might still see an RX 9060 with 8GB memory and an RX 9060 XT with 16GB, but the XT model might *now* get a wider, 192-bit interface."

To my understanding of English grammar, the ", but" and "now" are two strong indicators that the part of the sentence talking about a wider, 192-bit interface contradicts the first part of the sentence talking about the 128-bit bus.

I agree that it should be worded more clearly, but DigitalTrends aren't clueless. I'm fairly certain they know you can't have 16GB on a 192-bit bus, based on the level of technical depth some of their other articles have gone into, so I'm willing to chalk this one up to ambiguous wording rather than technical ignorance.

Anyway, it is what it is. Hopefully the smaller Navi44 die is a 192-bit product, because that would make the entry level an absolute banger for 2025 :)
 
It would be nice to stick with AMD, but so many issues with my 7900XTX. So many TDRs. I've been following everything out there to find some sort of stability. Honestly, I cannot stick with them anymore. The only thing I found that somewhat stabilizes the card from crashing all the time is... to set the max frequency to 90%. So basically take 10% off the card's performance.
 
It would be nice to stick with AMD, but so many issues with my 7900XTX. So many TDRs. I've been following everything out there to find some sort of stability. Honestly, I cannot stick with them anymore. The only thing I found that somewhat stabilizes the card from crashing all the time is... to set the max frequency to 90%. So basically take 10% off the card's performance.
Sounds like you have a defective GPU, really.

I have both Nvidia and AMD at home, and I build mostly Nvidia machines at work these days; it's not any better on the Nvidia side. Faulty hardware is faulty, no matter what brand logo is on it.
 
Sounds like you have a defective GPU really.
Or insufficient power delivery (I don't just mean PSU wattage), or some other system issue…

0 issues with the 7900XTX for more than a year now.
 
Replaced twice! Same issues.
 
1200 W PSU, replaced from an older 1200 W unit.
 
View attachment 379332

If you check the AMD slide, it places the top RX 9060 model (XT) at a performance level higher than the RX 7700 XT, which means either it will be a cut-down Navi 48 variant with a 192-bit bus,
or the other option would be Navi 44 with 64 RBs and 2560 shaders, which would probably match or slightly exceed the 7700 XT in FHD (and maybe match it at QHD) if it has high enough clocks (at least 3040 MHz boost for the reference model), paired with a 128-bit bus and 16GB of memory (at this performance level - and price range accordingly - 8GB would be a detriment).
If the package size of Navi 44 is just 29 x 29 mm, we are probably talking about a max die size of ~162 mm², so it's nearly impossible to house a 192-bit bus design at these dimensions.
 
Replaced twice! Same issues.
Then it's not the card, unless you are the unluckiest buyer of the year.

1200 watt psu.Replaced from an older 1200 watt.
Wattage doesn't tell the whole story, unfortunately, although 1.2 kW sounds like enough for two 7900XTXs, to be honest.
Sometimes other factors of PSUs play a role too.

What variant of 7900XTX was it?

After 2 GPU replacements, that equals 3 GPUs (assuming different ones, not the same one sent back). I would be convinced that the GPU(s) had issues (by design) only if they were tested on a different PC and the issue remained.
Sorry, but I've been dealing with PCs for 25 years and have had too many GPUs; these kinds of issues could be anything.
 
Then it's not the card, unless you are the unluckiest buyer of the year.
Wattage doesn't tell the whole story, unfortunately, although 1.2 kW sounds like enough for two 7900XTXs, to be honest.
Sometimes other factors of PSUs play a role too.
What variant of 7900XTX was it?
After 2 GPU replacements, that equals 3 GPUs (assuming different ones, not the same one sent back). I would be convinced that the GPU(s) had issues (by design) only if they were tested on a different PC and the issue remained.
Sorry, but I've been dealing with PCs for 25 years and have had too many GPUs; these kinds of issues could be anything.

Exactly, it is not the card. It is the drivers.

1.) Two different rigs tested with different X670E motherboards.
2.) DDU and AMD Clean Utility.
3.) Windows 10 and 11 tested
4.) Memory tested with no overclock.
5.) BIOS updated.
6.) 12 months of AMD drivers tested.
7.) Two power supplies. With new cables tested.
8.) Multiple replaced GPUs.
9.) If you google around the net, there are many reported cases of TDRs not being fixed at all. This is an AMD driver issue with certain games and it will never be resolved.
 
Where are the RTX 5090 and RTX 4090 in this slide?
It places the RX 9070 series around RTX 5050 - RTX 5060 performance, and the RX 9060 series around RTX 5030 - 5010 performance levels!
AMD probably uses this slide to show which competing models sit alongside its products based on price, so it corresponds not to performance but to the price of the Nvidia models.
 
Exactly, it is not the card. It is the drivers.

1.) Two different rigs tested with different X670E motherboards.
2.) DDU and AMD Clean Utility.
3.) Windows 10 and 11 tested
4.) Memory tested with no overclock.
5.) BIOS updated.
6.) 12 months of AMD drivers tested.
7.) Two power supplies. With new cables tested.
8.) Multiple replaced GPUs.
9.) If you google around the net, there are many reported cases of TDRs not being fixed at all. This is an AMD driver issue with certain games and it will never be resolved.
Yes those bad drivers... always.
Any news on what model of 7900XTX it was?
And in what games were the drivers "crashing"?

It would be nice to stick with AMD, but so many issues with my 7900XTX. So many TDRs. Following everything out there to find some sort of stability. Honestly, cannot stick with them anymore. The only thing I found that somewhat stabilizes the card from crashing all the time is...to set the max frequency to 90%. So basically take 10% off the cards performance.
BTW, when you want to reduce clocks, you just limit power from -1% up to -10% in Adrenalin, and when that's not enough for whatever reason, you reduce the clock limit.
It's common practice among AMD GPU users when they want to reduce clocks/power (for whatever reason).

I'm starting to doubt whether any of this is real. Did you just join TPU today to make your anti-AMD statement?
You can't help getting that idea when you don't have a single issue with your games, even the most demanding ones, for over a year now...
 
This always leaves me astonished. How is it possible that such a useless and, at the same time, very, very expensive feature (from both a hardware and a price standpoint) has reached such a prominent role in EVERY GPU discussion among users?
Why do people ALWAYS pop up with "eh, but the ray tracing..."?
Yeah, seems to be the case. I was at the PC store the other day, and a random talk with the tech/returns guy about it led to another random guy jumping in to say how great ray tracing was on his 4070 playing Portal...
 
You're right, it IS beyond tiresome how AMD, now on its 3rd RT generation, cannot meaningfully improve their RT performance, to the point that half a dozen Nvidia cards place above it.
It is not that easy. Almost all "RT" effects implemented in games now are done via CUDA. Since CUDA is proprietary, it's not just a matter of adding more accelerators; they also have to figure out how to make them work correctly. It looks like there may be a "bypass"/"workaround" soon that should narrow the RT gap, if not close it, though.

I'm actually not a fan of the way RT is used most of the time, though. It's usually just slathered over everything, making old, dry, dusty concrete/brick look like it was sprayed with glossy epoxy and buffed to a high-shine finish. Piles of dirt probably should not reflect light like they were hosed down with baby oil, you know? I'm also pretty bummed that it is pretty much the only type of advanced lighting effect used anymore. I would like to see something else, or at least see it used more intentionally and not just applied to all things, reflective or not.

I get the feeling that the people who spend the most on PC hardware (or have their parents do it) are less worried about playing a good-looking, interesting game than they are about reciting the names of the parts they have and the numbers they can get. I like hot-rodding PCs too, but just going out and buying a Ferrari is not "hot rodding" imo. Getting an older system to perform much better than it should is what I enjoy doing. Although, since all games are pretty much developed for XB/PS and constrained by their many limitations, I see little point in blowing a ton of money on a PC anymore... for a while now, actually. The best a PC can do is look better, but considering it takes everything a PS/XB has just to look as good as it can, there is almost nothing left for much else. That is why basically the same games just keep getting regurgitated with only slightly better graphics. Their scope/scale and depth stay the same, if they are not outright reduced (which often happens towards the end of a generation, as developers try to provide better visuals than previous games, which requires dumbing the underlying game down even more). That's why PC games from 18-20 years ago have better enemy/NPC AI and so many more options than new games.

I don't care if people want to play games on a console, but since MS and Sony demand that the PC version of a game be the same as their console version other than visuals, PC games are limited by things that would not be a problem on PC. STALKER 2 is a perfect example. The A-Life (AI and offline NPC tracking) system is nothing like it should be. The spawning and despawning is almost as bad as Far Cry 5! It's not because they can't do it - they did it almost 20 years ago - the reason is that the Xbox cannot run that in the background. When my PC is maxed out on visuals, I still have 8+ threads and around 15 GB of RAM sitting idle. The Xbox uses every last bit of power just to run the game at medium-high settings at 30 fps. They would have to really lower the visual quality to get it working, and since an Xbox can't do it, they can't let my PC do it. Modders will probably have to fix it, since the licence terms prevent GSC Game World from doing so. It's all so lame.

end /rant/digressed;
 