
RX 9060 XT 16 GB GPU Synthetic Benchmarks Leak

Uhm, that's the point, it's not. So how does the claim "Nvidia fleecing their customers" hold any merit when AMD has done the same or worse? Because I swear to God I've never heard anyone around here claiming left and right on every goddamn thread that AMD is fleecing their customers.

Why is that, in your opinion ? :p

EG1. FSR exists for Pascal owners - thanks to Nvidia. We both know that FSR wouldn't have existed without Nvidia introducing DLSS, right?

David always gets a pass when facing Goliath.

That being said, FSR4 is a huge jump, so I believe AMD when they say it isn't possible on older generations. I don't for one second believe a 3080 and above could not have handled a version of the frame generation which AMD ended up supplying.

That being said, everyone should be buying GPUs for what they offer today, not what they may or may not offer in 2 years; anything we get beyond that should just be looked at as a bonus. Hats off to Nvidia for offering SR all the way back to Turing. Now hopefully RDNA4 is supported similarly as long; if in a generation or 2 it gets dropped, shame on AMD.
 
David always gets a pass when facing Goliath.

That being said, FSR4 is a huge jump, so I believe AMD when they say it isn't possible on older generations. I don't for one second believe a 3080 and above could not have handled a version of the frame generation which AMD ended up supplying.

That being said, everyone should be buying GPUs for what they offer today, not what they may or may not offer in 2 years; anything we get beyond that should just be looked at as a bonus. Hats off to Nvidia for offering SR all the way back to Turing. Now hopefully RDNA4 is supported similarly as long; if in a generation or 2 it gets dropped, shame on AMD.
I don't see Nvidia as a Goliath. Their market cap has gone through the roof very recently (the last couple of years) due to AI, not gaming. Before that, they were both multi-billion-dollar companies. It's a Goliath in terms of market share, but that's because, well, they are making better products.
 
I think you misread my post in multiple ways:

1. I didn't mention the 9070XT, I explicitly said 5070.
2. I compared the 5070 to the 3070. If I'd meant 4070, I'd have typed 4070. The additional clue was in the "50% more VRAM", i.e., the 3070's 8 GB vs the 5070's 12 GB.
And what's the point of that? The RTX 5070 replaces the RTX 4070, not the RTX 3070!

By TechPowerUp's numbers, the RTX 5070 @ 4K is 57% faster than the RTX 3070 (average fps) and 89% faster in RT.

By TechPowerUp's numbers, the RX 9070 XT @ 4K is 123% faster than the RX 6700 XT (average fps) and 193% faster in RT.

On gen-to-gen improvements, Nvidia is not even in the same universe when it comes to raw performance; AMD just flattens it.
 
And what's the point of that? The RTX 5070 replaces the RTX 4070, not the RTX 3070!

By TechPowerUp's numbers, the RTX 5070 @ 4K is 57% faster than the RTX 3070 (average fps) and 89% faster in RT.

By TechPowerUp's numbers, the RX 9070 XT @ 4K is 123% faster than the RX 6700 XT (average fps) and 193% faster in RT.

On gen-to-gen improvements, Nvidia is not even in the same universe when it comes to raw performance; AMD just flattens it.
It's like you don't even read what you're replying to, or what you wrote that prompted a reply in the first place.
I give up; this is like talking to a random phrase generator.
 
And what's the point of that? The RTX 5070 replaces the RTX 4070, not the RTX 3070!

By TechPowerUp's numbers, the RTX 5070 @ 4K is 57% faster than the RTX 3070 (average fps) and 89% faster in RT.

By TechPowerUp's numbers, the RX 9070 XT @ 4K is 123% faster than the RX 6700 XT (average fps) and 193% faster in RT.

On gen-to-gen improvements, Nvidia is not even in the same universe when it comes to raw performance; AMD just flattens it.
Uhm, what the hell?

By TPU's numbers, the RTX 5090 @ 4K is 500% faster than the RTX 4060.

By TPU's numbers, the RX 9070 XT @ 4K is 5% SLOWER than the RX 7900 XTX.

On gen-to-gen improvements, AMD is not even in the same universe when it comes to raw performance; Nvidia just flattens it.

Your comparison is just whack. Is that on purpose, or...?
 
The 5070 is 50% faster than the 3070 in raster, 100% faster in RT, has 50% more VRAM, and it comes at a significantly lower MSRP when you adjust for inflation.
The RTX 3070 has a way bigger die: 392 mm² vs the RTX 5070's 263 mm². Inflation will not help, because the RTX 3070 is one tier above the RTX 5070, not in the same class. Even the RTX 3060 Ti is a higher-tier GPU than the RTX 5070.

The RTX 3070 is lower-tier high end or something like that, and the RTX 5070 is just a midrange GTX 1060 6GB / RTX 3060 12GB type GPU. Nvidia quietly downgraded the RTX 40 and 50 series GPUs one tier lower.
 
I don't see Nvidia as a Goliath. Their market cap has gone through the roof very recently (the last couple of years) due to AI, not gaming. Before that, they were both multi-billion-dollar companies. It's a Goliath in terms of market share, but that's because, well, they are making better products.

In 2014, the last time AMD had decent market share, Nvidia was still 5x bigger.

You could also argue AMD is only where it is today due to the CPU portion of their business same with Nvidia and their AI BS.

AMD's GPU division would still be considered an underdog to Nvidia, and it has been for a long time, even when the market share split wasn't 9/1.
 
I think you misread my post in multiple ways:

1. I didn't mention the 9070XT, I explicitly said 5070.
2. I compared the 5070 to the 3070. If I'd meant 4070, I'd have typed 4070. The additional clue was in the "50% more VRAM", i.e., the 3070's 8 GB vs the 5070's 12 GB.
Yes, it has 50% more VRAM, but that still doesn't change Nvidia's poor gen-to-gen performance improvements. You cannot escape the obvious when the facts are in front of you.
 
Don't use die size to compare GPUs, use performance results in real scenarios. All this talk of cut-down silicon and stuff like "a 5070 is really a 5060 because of the 192-bit bus, the xx60 cards were 192-bit for generations" is dumb. The 5070 is 50% faster than the 3070 in raster, 100% faster in RT, has 50% more VRAM, and it comes at a significantly lower MSRP when you adjust for inflation.

There is no such thing as inflation here. You get cheaper products each generation because of cost optimisations, not more expensive products.
50% is normally the generation-to-generation performance uplift and doesn't warrant a straight-away purchase decision - more likely DOA because it is not enough.
And yes, 5070 is a fake number for what should be a 5050 or 5050 Ti.
 
There is no such thing as inflation here. You get cheaper products each generation because of cost optimisations, not more expensive products.
50% is normally the generation-to-generation performance uplift and doesn't warrant a straight-away purchase decision - more likely DOA because it is not enough.
And yes, 5070 is a fake number for what should be a 5050 or 5050 Ti.

No matter how many times you repeat this, it will continue to be a false statement based on nothing but blind belief (wishful thinking) and not reality
 
No matter how many times you repeat this, it will continue to be a false statement based on nothing but blind belief (wishful thinking) and not reality

Why do you always speak about things which you clearly don't understand?

The RTX 2060 was a 445 mm² chip; the RTX 2080 Ti was a 754 mm² chip. So the RTX 2060 was around 59% of the RTX 2080 Ti.
Now look at the crap "RTX 5070": 263 mm², when the RTX 5090 is 750 mm², or 35%.

The GTX 950 was a 228 mm² chip; the fake "RTX 5070" is 263 mm².

I will tell you one thing - listen to those smarter and better than you. And from now on, you go to the ignore sector where you belong.
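The die-area ratios quoted above check out arithmetically; here's a quick sketch (die sizes in mm² as given in the post, not independently verified):

```python
# Die sizes (mm^2) as quoted in the post above; treat as assumptions.
dies = {
    "RTX 2060 (TU106)": 445,
    "RTX 2080 Ti (TU102)": 754,
    "RTX 5070 (GB205)": 263,
    "RTX 5090 (GB202)": 750,
}

# Ratio of the x70/x60-class die to its generation's flagship die.
turing_ratio = dies["RTX 2060 (TU106)"] / dies["RTX 2080 Ti (TU102)"]
blackwell_ratio = dies["RTX 5070 (GB205)"] / dies["RTX 5090 (GB202)"]

print(f"RTX 2060 / RTX 2080 Ti: {turing_ratio:.0%}")   # ~59%
print(f"RTX 5070 / RTX 5090:    {blackwell_ratio:.0%}")  # ~35%
```

Whether die-area share of the flagship is a fair basis for naming tiers is, of course, exactly what the replies below dispute.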
 
No matter how many times you repeat this, it will continue to be a false statement based on nothing but blind belief (wishful thinking) and not reality
Well, turns out that the 9060 XT might end up being a great product. Lots of VRAM (for the 16 GB version), better colors than Nvidia, doesn't need to be always online for upscaling to work, and it's going to walk all over the 5070.
 
Then we should be expecting a huge beatdown; the 9060 XT should smack the lights out of the 5050 (5070). We will find out in a few days, I guess.
We all know that the RTX 5070 is faster than the RX 9060 XT; please don't be silly. That's like 2+2 = 4.
 
We all know that the RTX 5070 is faster than the RX 9060 XT; please don't be silly. That's like 2+2 = 4.
How do we know that? If the 5070 is a fake name and is instead a 5050 (that was the claim, my man), why the heck would we not expect the 9060 XT to beat it?
 
How do we know that? If the 5070 is a fake name and is instead a 5050 (that was the claim, my man), why the heck would we not expect the 9060 XT to beat it?
Yes, Nvidia's chips are cut down, but they're faster at the same hardware level, so what are we talking about? The RX 9060 XT is a low-end 128-bit GPU.
 
Yes, Nvidia's chips are cut down, but they're faster at the same hardware level, so what are we talking about? The RX 9060 XT is a low-end 128-bit GPU.
So the 9060 XT won't be beating the 5050? Okay, I still have my hopes up, but we will see next week.
 
Why do you always speak about things which you clearly don't understand?

I will tell you one thing - listen to the smarter and better than you.
I just want to be sure here... You're asserting that you're smarter and better than someone because you believe a product belongs in a different naming and price tier based on the single metric you picked out, ignoring all other metrics like market conditions, bill of materials, production costs, etc.; that this is factually irrefutable; and that anyone who disagrees is dumber and worse than you?
 
I just want to be sure here... You're asserting that you're smarter and better than someone because you believe a product belongs in a different naming and price tier based on the single metric you picked out, ignoring all other metrics like market conditions, bill of materials, production costs, etc.; that this is factually irrefutable; and that anyone who disagrees is dumber and worse than you?

Bruh, this is the dude trying to use a program made by W1z to debunk W1z's power numbers in his reviews...
 
There is no such thing as inflation here. You get cheaper products each generation because of cost optimisations, not more expensive products.
50% is normally the generation-to-generation performance uplift and doesn't warrant a straight-away purchase decision - more likely DOA because it is not enough.
And yes, 5070 is a fake number for what should be a 5050 or 5050 Ti.
Arf, we've had only one 50% generational performance uplift in the last decade, maybe even the last two decades - and that was Pascal in the GTX 10-series. That 50% uplift was a combination of Pascal being really very good for gaming and Maxwell having been around for a long while, having been introduced in the 700-series as the GTX 750 Ti. The 900-series Maxwell cards were decent, but perhaps a little below par in terms of generational progress over Kepler, so the longer-than-usual delay between Maxwell and Pascal, coupled with the slightly underwhelming Maxwell as a starting point to base your generational uplift on, certainly helps explain why that 50% increase happened then and not again since.

If I've forgotten another significant generational uplift, then please feel free to correct me, but uplifts of more than 30% are pretty rare in the industry these days, and 50% is almost a pipe dream at this point.
 
Why do you always speak about things which you clearly don't understand?

The RTX 2060 was a 445 mm² chip; the RTX 2080 Ti was a 754 mm² chip. So the RTX 2060 was around 59% of the RTX 2080 Ti.
Now look at the crap "RTX 5070": 263 mm², when the RTX 5090 is 750 mm², or 35%.

The GTX 950 was a 228 mm² chip; the fake "RTX 5070" is 263 mm².

I will tell you one thing - listen to those smarter and better than you. And from now on, you go to the ignore sector where you belong.

The ad hominem followed by a room-temperature-IQ argument will always result in a very classy post. It's almost, just about almost, as if the entire point of developing newer process nodes is to increase transistor density, enabling the production of more advanced processor designs, making the "muh die size" argument completely irrelevant. And of course, continue to exclude every other factor, such as the cost of the components, heatsink, and plastics on the hardware, as well as the research and development of the chip, the driver development costs, etc., all of which factor into a GPU's final price.

To top it off... you can still buy an older product if it suits your needs. This is hysteria for the purposes of hysteria, and little else.
 
Arf, we've had only one 50% generational performance uplift in the last decade, maybe even the last two decades - and that was Pascal in the GTX 10-series. That 50% uplift was a combination of Pascal being really very good for gaming and Maxwell having been around for a long while, having been introduced in the 700-series as the GTX 750 Ti. The 900-series Maxwell cards were decent, but perhaps a little below par in terms of generational progress over Kepler, so the longer-than-usual delay between Maxwell and Pascal, coupled with the slightly underwhelming Maxwell as a starting point to base your generational uplift on, certainly helps explain why that 50% increase happened then and not again since.

If I've forgotten another significant generational uplift, then please feel free to correct me, but uplifts of more than 30% are pretty rare in the industry these days, and 50% is almost a pipe dream at this point.
Ampere (2080 -> 3080) was over 50% as well, especially at 4K.

But all of this is irrelevant; the sad part is that the only company that tries in the GPU space is the one that is the most incapable of delivering a good product (Intel). So we are stuck with two bad options; choose your poison.
 
Ampere (2080 -> 3080) was over 50% as well, especially at 4K.

But all of this is irrelevant; the sad part is that the only company that tries in the GPU space is the one that is the most incapable of delivering a good product (Intel). So we are stuck with two bad options; choose your poison.

And Turing, IMHO, is worthy of its namesake: it introduced pretty much all of the modern GPU technologies in use today. Sure, it was very expensive, but it was also the pioneer of the modern-day RT- and AI-enabled GPU.

Arf, we've had only one 50% generational performance uplift in the last decade, maybe even the last two decades - and that was Pascal in the GTX 10-series. That 50% uplift was a combination of Pascal being really very good for gaming and Maxwell having been around for a long while, having been introduced in the 700-series as the GTX 750 Ti. The 900-series Maxwell cards were decent, but perhaps a little below par in terms of generational progress over Kepler, so the longer-than-usual delay between Maxwell and Pascal, coupled with the slightly underwhelming Maxwell as a starting point to base your generational uplift on, certainly helps explain why that 50% increase happened then and not again since.

If I've forgotten another significant generational uplift, then please feel free to correct me, but uplifts of more than 30% are pretty rare in the industry these days, and 50% is almost a pipe dream at this point.

Unfortunately, thanks to "GPU shrinkflation" and bumping up SKUs.

The progress is there, but it doesn't reflect in the consumer space. For example, the RTX 4090 is not a full AD102; Nvidia never even released a full AD102, including in the professional space. The same applies to the RTX 5090: it's not a fully enabled processor, with about 88% of the cores and 75% of the cache of a full GB202 chip. The last high-end card released to the gaming segment with a fully enabled processor was the RTX 3090 Ti - and it is worth remembering that the 3090 Ti was an early-2022 product built out of a (by then) 2-year-old GA102, widely considered a cash grab due to its very high price and because Ada was only months away.

It is now 2025, and we're only at ~100% over that product's performance with the RTX 5090, in its cut-down condition, and only now has that performance level been made available in a more affordable segment (the RX 9070 XT is around 5% faster than an RTX 3090 Ti, with a $1,400 lower MSRP - not that MSRP is real or matters, just for reference purposes). That's 5 years since the creation of GA102 and 3 years on from the 3090 Ti's release.
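The "88% of cores, 75% of cache" figures above are easy to sanity-check; a quick sketch, using the commonly reported GB202 configuration (the core and cache counts here are assumptions for illustration, not taken from the post):

```python
# Commonly reported GB202 configuration vs the shipping RTX 5090;
# these counts are assumptions, used only to check the quoted fractions.
full_gb202_cores = 24576   # CUDA cores in a fully enabled GB202 (192 SMs x 128)
rtx5090_cores = 21760      # CUDA cores enabled on the RTX 5090 (170 SMs)
full_gb202_l2_mb = 128     # L2 cache on a full GB202, MB
rtx5090_l2_mb = 96         # L2 cache enabled on the RTX 5090, MB

print(f"cores enabled: {rtx5090_cores / full_gb202_cores:.1%}")  # ~88.5%
print(f"cache enabled: {rtx5090_l2_mb / full_gb202_l2_mb:.1%}")  # 75.0%
```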
 