
RX 9060 XT 16 GB GPU Synthetic Benchmarks Leak

I am appalled by the level of sheer ignorance being displayed by people who are very clearly intelligent enough to work out the facts of reality... What the hell people, you're smarter than this! Good fricken grief.. :rolleyes:
It's a relative performance graph. Relative to the 3080, the 2080 is 40% slower. Relative to the 2080, the 3080 is 66.7% faster. Saying the 3080 is 40% faster than the 2080 is incorrect.

[Attached: Screenshot From 2025-06-01 06-11-05.png]


This is the average FPS at 4K from the 3080 review. Relative to the 2080, the 3080 is 67% faster. Relative to the 3080, the 2080 delivers 60% of the performance, or is 40% slower. I'm trying to be polite here despite your rudeness, because I actually misread a relative performance graph exactly as you did on another forum, and was corrected by another poster.
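Since this keeps tripping people up, here's a minimal sketch of the arithmetic. The FPS values are made up to match the ratios above, not the actual review figures:

```python
# Illustrative FPS numbers only (chosen to match the ratios above,
# not the actual TechPowerUp review figures).
fps_3080 = 100.0
fps_2080 = 60.0

# Relative to the 2080, the 3080 is (100/60 - 1) ~= 66.7% faster.
faster = fps_3080 / fps_2080 - 1

# Relative to the 3080, the 2080 delivers 60/100 = 60% of the performance,
# i.e. it is 1 - 0.60 = 40% slower.
slower = 1 - fps_2080 / fps_3080

print(f"3080 vs 2080: {faster:.1%} faster")   # 66.7% faster
print(f"2080 vs 3080: {slower:.1%} slower")   # 40.0% slower
```

Same two numbers, two different percentages, depending on which card you use as the baseline.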
 
The RTX 3080 was a decent GPU in comparison to the RTX 2080, or to that piece of s*it, the RTX 4080.

RTX 2080: poor gen-to-gen performance and poor price/performance in general.
RTX 4080: good performance, but a terrible price and an even worse price/performance ratio than the RTX 2080.
RTX 3080: a good performer, and at the time it was going for around €500-600 used (not at launch, of course). RTX 30 Series prices came down much faster than RTX 40 Series prices, which hold their value forever.
 
FSR 1 to 3.1 were all software-based. FSR 4 runs on the AI units of the card, so it's hardware-based. Those AI units on the previous-generation RDNA cards are super slow (hence the trash RT performance). That's why the 9070 XT is much better in RT, even though it has half the AI cores.
I believe there's no official explanation but this is my semi-educated take on things.
Do you know how Intel gets XeSS to work on other cards when it's also AI-trained? To get the best version you still need an Arc card, but it seems to at least have a more stable image than FSR 3.

As for the actual topic, 66% of the 9070 sounds about what I expected.

It was always going to match the 5060 Ti or be faster, and the 5060 Ti is already at RX 6800 level in raster.
 
A 7700 XT but with FSR 4. AMD selling features as a new generation of cards; now, where did we see that before........
 

- Yeah, looks about right.

Not sure what people were expecting; this is an N44 die absolutely juiced to the gills. Broader tests with fewer/lighter RT games will probably have it neck and neck with the 5060 Ti.

So long as it stays relatively close to its MSRP, it's a solid deal for this gen. If its price floats more than $50 up, it's DOA.
 
I can't. I'm dying. AMD does it again...

The 8 GB 9060 XT will be matching the 3070 Ti :D
4% better average and 17% better 1% lows at 1080p. Some pretty bad results for the 3070 Ti at 1080p in Alan Wake 2 and Cyberpunk 2077. Comparatively horrendous 1% lows in 4/5 individual games shown. The 3070 Ti isn't matching the 9060 XT if it gives you a slightly lower average but is a stuttering mess due to awful frame pacing.
[Attached: RX9060XT-LEAK-2.jpg]

I don't know how trustworthy eTeknix's testing methodology is anyway. I hadn't heard of him before today. Dude is trying to be discount Hardware Unboxed to the point of hilarity. From the slides, to the background set, to the presentation/editing style, to the b-roll, to the thumbnails, muh guy is trying to copy HUB's style exactly. It's actually kind of creepy.
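For anyone wondering why the 1% lows matter more than the average here, a quick sketch of how the two metrics are typically derived from a frame-time capture. The data is made up (not eTeknix's numbers), and "1% low" definitions vary between outlets; this uses one common one, the average of the slowest 1% of frames:

```python
# Rough sketch of why a similar average can hide bad frame pacing.
frame_times_ms = [16.7] * 990 + [40.0] * 10   # mostly smooth, with occasional 40 ms stutters

fps_per_frame = sorted(1000.0 / t for t in frame_times_ms)   # slowest frames first

avg_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)   # time-weighted average FPS
slowest_1pct = fps_per_frame[: max(1, len(fps_per_frame) // 100)]
one_percent_low = sum(slowest_1pct) / len(slowest_1pct)

print(f"Average FPS: {avg_fps:.1f}")          # still looks fine (~59)
print(f"1% low FPS:  {one_percent_low:.1f}")  # exposes the stutter (~25)
```

Two cards with nearly identical averages can feel completely different if one of them has a 1% low like that.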
 
If those are ballpark performance figures, it's fine at $350-370, a whole $100 less than what the 5060 Ti 16 GB is going for, and at least you can use this card at 1440p and not have to worry as much about conslow ports. In reality I'm expecting $400-420 after the first week, so meh, but it will likely sell because everything else is trash at the $500-and-under price point.


Anyone who wants one at a good price better pull the trigger at launch lmao.
 
4% better average and 17% better 1% lows at 1080p. Some pretty bad results for the 3070 Ti at 1080p in Alan Wake 2 and Cyberpunk 2077. Comparatively horrendous 1% lows in 4/5 individual games shown. The 3070 Ti isn't matching the 9060 XT if it gives you a slightly lower average but is a stuttering mess due to awful frame pacing.

I don't know how trustworthy eTeknix's testing methodology is anyway. I hadn't heard of him before today. Dude is trying to be discount Hardware Unboxed to the point of hilarity. From the slides, to the background set, to the presentation/editing style, to the b-roll, to the thumbnails, muh guy is trying to copy HUB's style exactly. It's actually kind of creepy.
My guess is pretty much spot on at 1080p, but I would expect the 5060 Ti 16 GB to take a bit of a bigger lead at 1440p, as it comes down to the 5060 Ti having roughly 40% more memory bandwidth (448 GB/s vs the 9060 XT's 322 GB/s). Remember, the slides AMD put out tested the 5060 Ti 8 GB against their card at 1440p ultra.
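As a rough sanity check on those bandwidth figures, here's where they come from. The bus widths and per-pin data rates below are assumptions that reproduce the numbers quoted above (128-bit at 28 Gbps GDDR7 for the 5060 Ti 16 GB, 128-bit at ~20.1 Gbps GDDR6 for the 9060 XT):

```python
# Back-of-envelope peak bandwidth: bus width (bits) x data rate (Gbps) / 8.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

bw_5060ti = peak_bandwidth_gbs(128, 28.0)   # ~448 GB/s
bw_9060xt = peak_bandwidth_gbs(128, 20.1)   # ~322 GB/s

print(f"5060 Ti 16 GB: {bw_5060ti:.0f} GB/s")
print(f"9060 XT:       {bw_9060xt:.0f} GB/s")
print(f"Bandwidth advantage: {bw_5060ti / bw_9060xt - 1:.0%}")   # ~39%
```

Whether that theoretical advantage actually shows up at 1440p depends on how bandwidth-bound the games are.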
 
My guess is pretty much spot on at 1080p, but I would expect the 5060 Ti 16 GB to take a bit of a bigger lead at 1440p, as it comes down to the 5060 Ti having roughly 40% more memory bandwidth (448 GB/s vs the 9060 XT's 322 GB/s). Remember, the slides AMD put out tested the 5060 Ti 8 GB against their card at 1440p ultra.
Should I wait for the 9060 XT or get a used 7900 GRE for €430?
 
My guess is pretty much spot on at 1080p, but I would expect the 5060 Ti 16 GB to take a bit of a bigger lead at 1440p, as it comes down to the 5060 Ti having roughly 40% more memory bandwidth (448 GB/s vs the 9060 XT's 322 GB/s). Remember, the slides AMD put out tested the 5060 Ti 8 GB against their card at 1440p ultra.
This seems like a pretty RT-heavy suite. In pure raster, I wouldn't be all that surprised if the 9060 XT slightly beats the 5060 Ti 16 GB at 1080p and possibly even 1440p. I was assuming AMD used the 8 GB card to make the RT performance look better. Not many games struggle at 1440p ultra with 8 GB of VRAM until you turn on RT, so it probably didn't help AMD's slides on the games without RT very much. I could definitely see the 5060 Ti 16 GB having an advantage at 1440p with RT or at 2160p ultra. The memory bandwidth advantage will probably help them in those scenarios. Not much point in speculating at this point though. We'll know in a couple of days when we can see some quality reviews.
 
This seems like a pretty RT-heavy suite. In pure raster, I wouldn't be all that surprised if the 9060 XT slightly beats the 5060 Ti 16 GB at 1080p and possibly even 1440p. I was assuming AMD used the 8 GB card to make the RT performance look better. Not many games struggle at 1440p ultra with 8 GB of VRAM until you turn on RT, so it probably didn't help AMD's slides on the games without RT very much. I could definitely see the 5060 Ti 16 GB having an advantage at 1440p with RT or at 2160p ultra. The memory bandwidth advantage will probably help them in those scenarios. Not much point in speculating at this point though. We'll know in a couple of days when we can see some quality reviews.
If you look at the slide, it had one side showing the game at ultra settings and the other side showing the game with RT on. Given they claim to be 6% faster across 40 games, when only around 7-8 of the games listed are faster, what about the other 33-34 games?
[Attached: 1748916782716.png]
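To put that question in perspective, here's a back-of-envelope sketch of what a "+6% across 40 games" average implies for the unlisted titles. The numbers are purely illustrative, and it assumes a plain arithmetic mean since the slide doesn't say how the 40-game figure is aggregated:

```python
# Purely illustrative back-of-envelope math for the question above.
n_total, overall_gain = 40, 0.06     # "+6% on average across 40 games"
n_listed, listed_gain = 8, 0.20      # assume the ~8 highlighted wins average +20%

# overall = (n_listed * listed + n_rest * rest) / n_total  =>  solve for rest
rest_gain = (n_total * overall_gain - n_listed * listed_gain) / (n_total - n_listed)
print(f"Implied average for the other {n_total - n_listed} games: {rest_gain:+.1%}")
# -> roughly +2.5%, i.e. close to a wash outside the highlighted titles
```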


Should I wait for the 9060 XT or get a used 7900 GRE for €430?
It depends on what is more important to you. The 7900 GRE is about 20-25% faster; that's a rough guess based on 5060 Ti performance. The 9060 XT will do better with RT turned on and has access to FSR 4, whereas the 7900 GRE is weaker at RT and doesn't have the hardware for FSR 4.
 
4% better average and 17% better 1% lows at 1080p. Some pretty bad results for the 3070 Ti at 1080p in Alan Wake 2 and Cyberpunk 2077. Comparatively horrendous 1% lows in 4/5 individual games shown. The 3070 Ti isn't matching the 9060 XT if it gives you a slightly lower average but is a stuttering mess due to awful frame pacing.
You can't be seriously happy about that, can you? It's a 5-year-old card, man. The 8 GB 9060 XT will be barely above the regular 3070...
 
You can't be seriously happy about that, can you? It's a 5-year-old card, man. The 8 GB 9060 XT will be barely above the regular 3070...
Am I happy about it? No, and I'm not sad or mad about it either. It is what it is. There's nothing we can do about it but refuse to buy the products at these inflated prices. We're not getting the gen-on-gen performance improvement we used to, for multiple reasons. If you want to look at the value of old cards and froth at the mouth about the lack of value we're getting 4-5 years later, be my guest. This isn't specific to AMD, so I'm not sure why you're bringing this up as if it is.

It's a 4-year-old card, not a 5-year-old card. A 4-year-old card with a $600 2021 MSRP, 2x the die size, 2x the memory bandwidth, and 2x the power budget. The 3060 is just as old and still being produced and sold for $300. The 5060 is slower than a 3070 and also has a $300 MSRP. The 5060 Ti 8 GB is slower than the 3070 in some instances and has a $380 MSRP. Any falloff vs older cards that the 9060 XT 8 GB has against the 16 GB will also hold true for the 5060 Ti variants as well. How about we wait for some quality reviews and see how the performance and street pricing pans out. At that point we can try to have an intelligent discussion about the value these 9060 XT cards represent vs the other cards that are currently available on the market.
 
Any falloff vs older cards that the 9060 XT 8 GB has against the 16 GB will also hold true for the 5060 Ti variants as well. How about we wait for some quality reviews and see how the performance and street pricing pans out. At that point we can try to have an intelligent discussion about the value these 9060 XT cards represent vs the other cards that are currently available on the market.
Well, of course it applies to the 5060 Ti variants as well, but it's already well established and documented that NVIDIA is a big greedy evil corpo that eats kids for breakfast and ruins PC gaming with stagnation. Surely that can't be the metric, right? :D
 