
Upgrade 6600 XT to 9060 XT 16 GB

I'm coming from a 6600 XT. Am I right to go for the 9060 XT 16 GB, is it a good upgrade? It would be paired with a 5800X3D.
It's a 60% performance bump. It's not bad - definitely noticeable, but it's not world-altering either. I'd rather suggest you save up more and go for the 5070 - or if you wanna stick with AMD, the 9070 when it comes down in price. It will be a much more noticeable performance upgrade. Going by pure value, a 9070 gives you 2.5 times the performance of your 6600 XT vs just 1.6 times for the 9060 XT.
 
It's a 60% performance bump. It's not bad - definitely noticeable, but it's not world-altering either. I'd rather suggest you save up more and go for the 5070 - or if you wanna stick with AMD, the 9070 when it comes down in price. It will be a much more noticeable performance upgrade. Going by pure value, a 9070 gives you 2.5 times the performance of your 6600 XT vs just 1.6 times for the 9060 XT.

Especially if the 9070 hits MSRP at some point...
 
too much money in return for a very low bump in performance, don't do it.
 
Is 56% a "very low bump"?

Yes, 'cause you're essentially paying the full $350 and you only get 60% of a 6600 XT's performance on top.

Or in other terms, it takes you from 50 fps to 80, so you are paying $350 for 30 fps.
 
Which means they are going from suboptimal performance to optimal performance, while at the same time VASTLY improving their ability to run ray tracing. You were saying, Mr ngeeedia?
 
Nice gaslighting.
It's not gaslighting, it's me genuinely trying to inform the person that it's not okay to shell out for a double-digit performance boost, provided they're just playing video games. Higher-end GPUs aren't as castrated (even in relative terms, which was unheard of until recently), so they will get both more improvement and a much longer period without needing an upgrade.

Cyberpunk 2077 scales very well. 9060 XT, 380 Euro, 34 FPS:
[Cyberpunk 2077 4K benchmark chart]


And the 9070, which costs around 630 Euro (+65%), provides 23 more FPS (+67%):
[Cyberpunk 2077 4K benchmark chart]


By stretching the budget by 250 Euro, the OP will go from +60% to +150% performance. From "okay, it's become more playable" to "whoa, it's mind-blowing!"
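To make the scaling arithmetic explicit, here's a minimal sketch; the 6600 XT baseline is back-computed from the +60% claim rather than measured, so treat it as an estimate:

```python
# Scaling arithmetic for the Cyberpunk 2077 4K numbers above.
# The 6600 XT baseline is back-computed from the "+60%" claim,
# so everything here is an estimate, not a measurement.
fps_9060xt = 34.0
fps_9070 = fps_9060xt + 23.0        # 57 FPS
fps_6600xt = fps_9060xt / 1.60      # ~21 FPS implied baseline

for name, fps in [("9060 XT", fps_9060xt), ("9070", fps_9070)]:
    print(f"{name}: {fps:.0f} FPS, +{(fps / fps_6600xt - 1) * 100:.0f}% over 6600 XT")

# 9060 XT: 34 FPS, +60% over 6600 XT
# 9070: 57 FPS, +168% over 6600 XT
# (the "+150%" figure comes from the 2.5x average-benchmark value
#  quoted earlier in the thread; this one game scales harder than average)
```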
 
Which means they are going from suboptimal performance to optimal performance, while at the same time VASTLY improving their ability to run ray tracing. You were saying, Mr ngeeedia?
I was saying that paying 629€ for 75 extra fps is better than paying 369€ for 30 extra fps. In the first case you pay about 8.4€ per extra fps, in the latter about 12.3€. The value just isn't there in low-end GPUs.
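As a quick check of that per-frame arithmetic, a minimal sketch (the prices and FPS deltas are the ones quoted in this thread, not official figures):

```python
# Cost per extra frame over the 6600 XT, using the prices and FPS
# deltas given above (euro; thread figures, not official pricing).
upgrades = {
    "9060 XT": (369, 30),   # price, extra fps over the 6600 XT
    "9070":    (629, 75),
}
for card, (price, extra_fps) in upgrades.items():
    print(f"{card}: {price / extra_fps:.1f} EUR per extra fps")

# 9060 XT: 12.3 EUR per extra fps
# 9070: 8.4 EUR per extra fps
```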
 
I was saying that paying 629€ for 75 extra fps is better than paying 369€ for 30 extra fps. In the first case you pay about 8.4€ per extra fps, in the latter about 12.3€. The value just isn't there in low-end GPUs.


this feels like "the more you buy, the more you save" :roll:

Both are awful propositions, but for someone on a budget it's much worse. I assume if you can only afford to spend 300, you're on a tight budget, which makes it that much more of a crime to waste it. If you have a bigger budget, you can probably spend it without it making such an impact on your personal finances.
 
I was saying that paying 629€ for 75 extra fps is better than paying 369€ for 30 extra fps.
That's a valid point, but also an opinion. You don't know where the OP is at. Maybe 630 euro is more than they can afford right now. That's the world we're in ATM. So yes, the 9060 XT is a very good upgrade from the 6600 XT. Full fricken stop.

It's not gaslighting, it's me genuinely trying to inform the person that it's not okay to shell out for a double-digit performance boost, provided they're just playing video games. Higher-end GPUs aren't as castrated (even in relative terms, which was unheard of until recently), so they will get both more improvement and a much longer period without needing an upgrade.

Cyberpunk 2077 scales very well. 9060 XT, 380 Euro, 34 FPS:
[Cyberpunk 2077 4K benchmark chart]

And the 9070, which costs around 630 Euro (+65%), provides 23 more FPS (+67%):
[Cyberpunk 2077 4K benchmark chart]

By stretching the budget by 250 Euro, the OP will go from +60% to +150% performance. From "okay, it's become more playable" to "whoa, it's mind-blowing!"
That's 4K. Most people are NOT running 4K. Most people (55%) are on 1080p, with only about 20% on 1440p. 4K? Less than 5%. Hell, there's nearly 3% running 768p.
[Steam Hardware Survey resolution chart, May 2025]

So yeah, if we're going to quote numbers, let's keep it realistic and in line with the majority.
 
That's a valid point, but also an opinion. You don't know where the OP is at. Maybe 630 euro is more than they can afford right now. That's the world we're in ATM. So yes, the 9060 XT is a very good upgrade from the 6600 XT. Full fricken stop.

I think some people don't realize that spending 70% more money isn't a possibility for everyone.

I'm actually in the process of doing a build where the person couldn't spend a dime over $400 on a GPU and didn't want an AMD GPU, although amusingly they wanted a Ryzen CPU. It took them 8 months to save up, and they didn't want to wait any longer, not even a few days for the 9060 XT reviews...

So while the mid-tier cards are a better overall value from a longevity standpoint in this messed-up 2025 GPU world, they're not an option for someone who doesn't want to wait another 6-12 months to save up even more money.

Even if all these cards were exactly MSRP, that's still a 50% bump in price.

The 9060 XT 16GB is the best card under 400 USD. It doesn't have some of the downsides of the Nvidia options, and if AMD can just get FSR4 into more games, it would be a pretty well-rounded product, if a bit meh from a generational standpoint.

I wish the person I'm doing the build for had gone with it instead, even though technically it would have cost more due to a bundle deal I got for them with the 5060 Ti 8GB, which ended up close to what I could source a 5060 for.

Although some of that is just me wanting to mess around with an RDNA4 card, lol. This is my third Blackwell card I've had hands-on with, and it's way more work giving the customer a settings guide, because it's right on the edge in some games, and with games being so variable, just play time and game area can make things worse than what I'm observing. I'm kinda contemplating having them just use medium settings at 1080p to be safe, although DLSS Quality at 1080p seems to work great in almost everything. The last thing I need is for them to have a stuttering mess and not understand that VRAM is the issue.
 
Exactly. It's a thing and a lot of people are in that boat. It sucks and there's no shame in it. Everyone needs to keep that in mind, but a lot of people don't.

I spend a lot of time on lower-tier hardware: drop some settings, don't use RT, and the gameplay is still the same. I'd still take it over a console, especially with even those being expensive in 2025.

Even 40-50 fps with VRR, as long as there are no major stutters, can be fine in a lot of games. In shooters, just use a combination of Medium/High and possibly some upscaling, no problem.

1080p is still fine at 24 inches, and with the Transformer model and FSR4, upscaling is now at least usable at that resolution.

I get both sides, and as enthusiasts there's nothing wrong with explaining why a 9070/5070 could be a better option if they can afford it. It's the cherry on top: a little more legs and some extra eye candy, super nice but not a requirement to enjoy a game.
 
That's 4K. Most people are NOT running 4K. Most people (55%) are on 1080p, with only about 20% on 1440p. 4K? Less than 5%. Hell, there's nearly 3% running 768p.
You fail to see the point.

The point of this 4K comparison is to show how the games behave when the only limiting factor is the GPU. Upcoming games will be as heavy at 1080p as Cyberpunk is at 2160p. Not all of them, but some will.

That's why I show the 4K results from the game I am perfectly sure is 100% GPU bound.
 
You fail to see the point.

The point of this 4K comparison is to show how the games behave when the only limiting factor is the GPU. Upcoming games will be as heavy at 1080p as Cyberpunk is at 2160p. Not all of them, but some will.

That's why I show the 4K results from the game I am perfectly sure is 100% GPU bound.

While that can be a good indication of future game performance, the 4060 now gets 38% more performance at 1440p than it did at 4K at launch, and I would argue it was a 1080p card even at launch. If you're talking two generations, sure, in the span of 4-5 years you'll likely drop 40-50%, but at that point it's time to upgrade again. Games tend to get 10-15% more demanding every year, but outside of VRAM being the limiting factor, most cards scale very similarly.

The 4060 has dropped 24% in performance over the last generation at 1440p, going from 69 to 52 fps, just as an example, and some of that is likely due to the VRAM. This isn't different for any card though.

The 4080 has dropped about 24% from its launch, going from 177 to 134 fps in that same span at 1440p.

Launch reviews vs 5060 review results.
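Those drop percentages fall straight out of the quoted FPS pairs; a minimal sketch, assuming the launch and retest numbers above are accurate:

```python
# Performance drop since launch, computed from the FPS numbers quoted
# in this post (launch review vs 5060-review retest, both at 1440p).
cards = {
    "RTX 4060": (69, 52),    # launch fps, current fps
    "RTX 4080": (177, 134),
}
for card, (launch, now) in cards.items():
    drop = (1 - now / launch) * 100
    print(f"{card}: {launch} -> {now} fps, -{drop:.1f}% since launch")

# RTX 4060: 69 -> 52 fps, -24.6% since launch
# RTX 4080: 177 -> 134 fps, -24.3% since launch
```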


I mean, the one thing that will never change is that people should get the best card within their budget/means. If doubling your budget were so easy, the majority of gamers wouldn't be gaming on 60-class hardware.

The next step up costs 60-70% more money. It's up to the individual and their disposable income to decide if that's worth it, not people on forums who aren't spending the money out of their own pockets.

I've built with the 6600 XT multiple times, and I would take a meh AF 5060 Ti 8GB over it any day of the week. I'm slightly higher on the 9060 XT. To me, anything over 40% is OK; keep in mind we're lucky to even get that over two generations at the low end.

There are also other factors besides raw performance: the doubling of VRAM, the vastly superior upscaler, and the much better performance in games like Doom DA that require RT.

Still, there's nothing wrong with recommending a 550-650 USD GPU, but telling someone they shouldn't upgrade at all if they can't afford one is ridiculous.
 
the 4060 now gets 38% more performance at 1440p than it did at 4K at launch
I'm talking about a 4-year-old game here. And upcoming, not current, games. So, almost 3 generations. This game was heavy and required a 330+ USD GPU to run smoothly at 1080p Ultra (no RT) back then. Today, with some DLSS, you can do that on a 300-dollar GPU at 4K (only diehards don't enable DLSS on their RTX GPUs at 4K). Something similar is happening at 1440p in recent titles. Upcoming ones should be as demanding at 1080p.

My point is the current xx60 class is an absolute shitshow; maaaaaaaybe the 16-gigabyte versions are kinda okay if heavily overclocked, but the 8 GB ones definitely deserve to rot on shelves till heavily discounted. One needs to be mega desperate to buy them. Hence my recommendation to save more money or wait for discounts, whichever comes first, if possible.
 
It's one thing if the OP had no GPU, then sure, buy one. But as he already has a working GPU, I would never say buy such a crappy upgrade.
The more you legitimise what AMD and Nvidia are doing, the worse things will get.
 
I'm talking about a 4-year-old game here. And upcoming, not current, games. So, almost 3 generations. This game was heavy and required a 330+ USD GPU to run smoothly at 1080p Ultra (no RT) back then. Today, with some DLSS, you can do that on a 300-dollar GPU at 4K (only diehards don't enable DLSS on their RTX GPUs at 4K). Something similar is happening at 1440p in recent titles. Upcoming ones should be as demanding at 1080p.

My point is the current xx60 class is an absolute shitshow; maaaaaaaybe the 16-gigabyte versions are kinda okay if heavily overclocked, but the 8 GB ones definitely deserve to rot on shelves till heavily discounted. One needs to be mega desperate to buy them. Hence my recommendation to save more money or wait for discounts, whichever comes first, if possible.

The whole stack is a shit show: we have a 5070 that's barely better than a 4070 Super, a 5070 Ti that couldn't even conclusively beat a 4080, and a 5080 that got one of the worst uplifts ever and couldn't even beat the previous generation's flagship. That's just 2025 in a nutshell.

I look at it this way:

$400 or less: I'd buy the 9060 XT 16GB; if the 4060 Ti 16GB were at MSRP, I'd lean that way. It's not that they're good, they're just the cheapest options that don't struggle in some games at 1440p.

$550-650: I'd buy the 9070; if the 5070 is $70 cheaper, I'd get that and just live with it being a 12GB card.

$700-1000: I'd buy the 5070 Ti; if the 9070 XT were $699 or cheaper for a good model, I'd buy that, assuming the 5070 Ti was $150 or more expensive.

$1000 or more: Nvidia, no competition. It just comes down to whether you want to spend over $1200 or over $2000 (closer to $3000 in the States).

That's just me though, and there really isn't a wrong choice, because everyone games differently. Someone who's OK with medium settings at 1080p might be fine with the $299 options, but someone who games at 4K and wants to use path tracing has one option, and it will destroy your wallet, and that's OK.

The sweet spot to me is the 9070 XT/5070 Ti performance-wise, but they come with comically bad pricing.

Still, if I couldn't spend a dime over $400, a 9060 XT 16GB at 1080p/1440p would be fine.

It's one thing if the OP had no GPU, then sure, buy one. But as he already has a working GPU, I would never say buy such a crappy upgrade.
The more you legitimise what AMD and Nvidia are doing, the worse things will get.

Buying any card this generation is legitimizing the shit being shoveled at us, but what are gamers supposed to do, wait another 2 years for a likely equally crappy generation?
 
likely equally crappy generation.
Unlikely. It's gonna be even worse. At this rate, I'm not gonna be surprised if they make an 8 GB RTX 6060 for more than 300 bucks.
 
Unlikely. It's gonna be even worse. At this rate, I'm not gonna be surprised if they make an 8 GB RTX 6060 for more than 300 bucks.

They wouldn't have to change much and could make it a 12GB GPU; the 3GB GDDR7 chips should be in good supply by then, although they could also cut it down even more and make it a 9GB GPU lmao

Wouldn't be surprised if either is the road they take, given how things are going.
 
the 3GB GDDR7 chips should be in good supply by then, although they could also cut it down even more and make it a 9GB GPU lmao
Or use leftover 1-GB GDDR6 chips, ultimately making an 8 GB sammich. And yeah, for some reason, TDP will exceed 200 W and it'll have an even more horrible power connector.
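For anyone wondering where these capacity figures come from: VRAM is just the number of 32-bit memory channels times the capacity per chip, doubled in a clamshell layout. A minimal sketch of that arithmetic; every configuration below is hypothetical, matching the speculation above rather than any announced spec:

```python
# VRAM capacity = (bus width / 32) memory chips x capacity per chip.
# Clamshell mounting doubles the chip count per channel.
# All configurations below are hypothetical -- speculation, not specs.
def vram_gb(bus_width_bits: int, gb_per_chip: int, clamshell: bool = False) -> int:
    chips = bus_width_bits // 32
    if clamshell:
        chips *= 2
    return chips * gb_per_chip

print(vram_gb(128, 2))                  # 8 GB: four 2GB chips (today's 8GB cards)
print(vram_gb(128, 3))                  # 12 GB: four 3GB GDDR7 chips
print(vram_gb(96, 3))                   # 9 GB: cut-down 96-bit bus, three 3GB chips
print(vram_gb(128, 1, clamshell=True))  # 8 GB: eight 1GB chips, the "sammich"
```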
 