
Upgrade 6600 XT to 9060 XT 16 GB

It depends on what you want. It's an upgrade yes, but what is your budget?
 
Across a large game test suite, you would be looking at around a 35-40% performance upgrade at 1080p. Whether it's a good upgrade depends on whether the 9060 XT gives you the performance you want in the games you play, for the money you are willing to spend. The 5800X3D would not hold the card back in terms of performance.
 
At MSRP, sure. At $400+, nah; I'd save up for something better.
 
Looked at Memex stock on the 9060 XT here in Canada. Holy crap, it's as expensive as 5060 Ti 16 GB models and all are out of stock.
 
At MSRP, sure. At $400+, nah; I'd save up for something better.

There’s not anything better until $600, really. Whether or not cards exist at MSRP in the OP's location is also a crapshoot (worse on faster parts, 5070 and above). If the 5060 Ti 16 GB were cheaper, it'd be a different story.

At $350-370, the 9060 XT 16 GB is, in my opinion, the best card to get if that's your budget.
 
I'd strongly recommend waiting until the likes of RTX 5070 become affordable for you (either by prices going down or by you becoming richer, or both). GPUs like this have much more oomph to them.
 
At MSRP, sure; at 400+, no, I'd save up for something better.
€389 for the 9060 XT.

Across a large game test suite, you would be looking at around a 35-40% performance improvement at 1080p. Whether it's a good upgrade depends on whether the 9060 XT gives you the performance you want in the games you play, for the money you are willing to spend. The 5800X3D would not hold the card back in terms of performance.
Is a used 7900 GRE at €430 better?

At MSRP, sure; at 400+, no, I'd save up for something better.
Is a used 7900 GRE at €430 better?
 
€389 for the 9060 XT.

That's passable; not great, not terrible. If it's within $50 of the 5060 Ti 16 GB I'm buying that instead, but otherwise your other new options in that price range are generally worse.

Is a used 7900 GRE at €430 better?

I would never buy used; that's just a question you'll have to answer yourself, whether you're willing to take on the extra risk. The 7900 GRE is one of the better RDNA3 GPUs, but you do lose out on FSR4, which is vastly superior to FSR2/3.
 
If the 5060ti 16gb were cheaper

Even at equal prices you would likely swing toward the 9060 XT because of its x16 PCIe interface; the 5060 Ti doesn't have that, which could potentially hurt performance on his PCIe 3.0 platform.

Hopefully W1z can do a PCIe scaling review running 3.0 to see exactly what negative effects it has on the 5060 Ti.

Here is a small sample.

 
Looked at Memex stock on the 9060 XT here in Canada. Holy crap, it's as expensive as 5060 Ti 16 GB models and all are out of stock.
Best Buy has some Dual Units for $449
 
Even at equal prices you would likely swing toward the 9060 XT because of its x16 PCIe interface; the 5060 Ti doesn't have that, which could potentially hurt performance on his PCIe 3.0 platform.

Hopefully W1z can do a PCIe scaling review running 3.0 to see exactly what negative effects it has on the 5060 Ti.

Here is a small sample.


I could be wrong, but I believe that is mostly an 8 GB GPU issue, because the GPU has to dump stuff onto system RAM and that's too slow over older PCIe. As long as you're not exceeding the frame buffer, I believe it's a non-issue, although you don't really want to exceed it on any PCIe version; it sucks less on 5.0, but it would still suck.
 
I could be wrong, but I believe that is mostly an 8 GB GPU issue, because the GPU has to dump stuff onto system RAM and that's too slow over older PCIe.

I believe they are separate entities, and VRAM should have no bearing on PCIe bandwidth. PCIe bandwidth comes from the CPU, whereas the GPU will access its own VRAM and spill into system memory if starved.

I'm no electrical engineer, but I believe this is how it works. Maybe somebody with better knowledge than me can chime in.

Found this quote on another forum:

There is a connection, yes, but the two are using two different memory pools.

The GPU memory bandwidth is how fast the GPU's connection to its VRAM buffer is, which is only used for graphics-related data like textures, shaders, geometry, etc.

PCI-e bandwidth measures how fast the GPU's connection is to the CPU's PCI-e controller, which controls the GPU's access to system memory (RAM).

The GPU has to copy relevant data for graphics from system RAM into its VRAM buffer, so that's how the PCI-e speed can affect the performance of the GPU.
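To put rough numbers on the two pools described in that quote, here is a back-of-the-envelope Python sketch. The bandwidth figures are approximate spec-sheet values assumed for illustration (a roughly 9060 XT-class card with ~320 GB/s of VRAM bandwidth), not measurements:

```python
# Compare how long moving graphics data takes over each link.
# All bandwidth numbers are rough, one-direction, for illustration only.
VRAM_GBPS = 320.0      # ~128-bit GDDR6 card's local VRAM bandwidth (assumed)
PCIE3_X16_GBPS = 15.8  # PCIe 3.0 x16 link to system RAM
PCIE3_X8_GBPS = 7.9    # PCIe 3.0 x8 (what an x8 card gets on a 3.0 board)

asset_gb = 2.0  # hypothetical 2 GB of assets that spilled to system RAM

for name, bw in [("VRAM", VRAM_GBPS),
                 ("PCIe 3.0 x16", PCIE3_X16_GBPS),
                 ("PCIe 3.0 x8", PCIE3_X8_GBPS)]:
    # time (ms) = size / bandwidth, converted from seconds
    print(f"{name:>12}: {asset_gb / bw * 1000:6.1f} ms to move {asset_gb} GB")
```

The gap is why spilling past the frame buffer hurts: the same data that VRAM serves in a few milliseconds takes hundreds of milliseconds to pull across the PCIe link, and twice as long again on an x8 card.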
 
I believe they are separate entities, and VRAM should have no bearing on PCIe bandwidth. PCIe bandwidth comes from the CPU, whereas the GPU will access its own VRAM and spill into system memory if starved.

I'm no electrical engineer, but I believe this is how it works. Maybe somebody with better knowledge than me can chime in.

Found this quote on another forum:

There is a connection, yes, but the two are using two different memory pools.

The GPU memory bandwidth is how fast the GPU's connection to its VRAM buffer is, which is only used for graphics-related data like textures, shaders, geometry, etc.

PCI-e bandwidth measures how fast the GPU's connection is to the CPU's PCI-e controller, which controls the GPU's access to system memory (RAM).

The GPU has to copy relevant data for graphics from system RAM into its VRAM buffer, so that's how the PCI-e speed can affect the performance of the GPU.

Kinda what I meant: it's only an issue if there is not enough VRAM and the system has to swap. In the DF video I believe it only shows the 5060 vs the 9060 XT 8GB, but in the HUB video the 9060 XT 8GB sucks at lower PCIe standards regardless of it having an x16-capable slot.


It might suck less but it still sucks

(screenshot attachment)
 
I wouldn’t spend that much for a 1/3rd performance increase. Put aside some money every week and get a holiday deal on something much more potent.

I would definitely save up and just wait for the 5070/9070 to hit MSRP; maybe by then FSR4 will have much better game support.


Easy for me to say though it isn't my $$$
 
Kinda what I meant: it's only an issue if there is not enough VRAM and the system has to swap. In the DF video I believe it only shows the 5060 vs the 9060 XT 8GB, but in the HUB video the 9060 XT 8GB sucks at lower PCIe standards regardless of it having an x16-capable slot.


It might suck less but it still sucks

(screenshot attachment)
That's one game, and one that's known to be glitchy. F1 2024 is the same. AFAIK, that's an outlier.
 
That's one game, and one that's known to be glitchy. F1 2024 is the same. AFAIK, that's an outlier.

There were multiple games that behaved that way even with upscaling.


(screenshot attachments)
 
@oxrufiioxo,

I was doing some research on PCIe scaling and, from what I can see, I might be flogging a dead horse with the 3.0 x16 vs 3.0 x8 debate, especially when talking about lower-end cards. The results are definitely there for 3.0 x16 vs x8 bandwidths (on a 5090), but it looks to be more game-dependent than anything else. Some games scale poorly (shown below) and some show a negligible difference.

Disregarding any 8 GB cards (let's say 16 GB cards only), we just want to look at the PCIe scaling.

So, 2.0 x16 = 3.0 x8, and 3.0 x16 = 4.0 x8 = 5.0 x4. I've gone off of W1z's 5090 scaling review. I'll upload the link and you can draw your own conclusions from there. I've shown some of the worst examples at 1080p, which probably best represents what the 5060 Ti and 9060 XT should be used for.

Please note that testing was done on a 9800X3D. It would be good to go back to something that runs PCIe 3.0 x16 natively (5800X/X3D/10900K) for more real-world results.
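Those lane-count equivalences follow directly from per-lane throughput. A minimal Python sketch, using the commonly quoted approximate per-lane figures (assumed spec values, not measurements):

```python
# Approximate usable per-lane throughput in GB/s, after link encoding
# (8b/10b for PCIe 2.0, 128b/130b for 3.0 and later).
PER_LANE_GBPS = {
    "2.0": 0.5,    # 5 GT/s
    "3.0": 0.985,  # 8 GT/s
    "4.0": 1.969,  # 16 GT/s
    "5.0": 3.938,  # 32 GT/s
}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Total one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

for gen, lanes in [("2.0", 16), ("3.0", 8), ("3.0", 16), ("4.0", 8), ("5.0", 4)]:
    print(f"PCIe {gen} x{lanes}: {link_bandwidth(gen, lanes):.1f} GB/s")
```

Each generation roughly doubles per-lane speed, so halving the lane count one generation up lands on about the same total, which is why an x8 card on a PCIe 3.0 board ends up with 2.0 x16-class bandwidth.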


(benchmark screenshots at 1080p)


We really need W1z to test the scaling of these lower-end cards to see what impact it has on performance; then we will be more informed about any purchasing decisions moving forward.

Cheers.
 
I do feel, as much of a pain as it is, that the lower-tier cards (let's say 5070 and below) should be tested on native 3.0/4.0 systems just to give a full picture. This is really only a thing because both AMD and Nvidia want to maintain margins at the $300 price level, which is probably 70% of the market or more.

Still, anyone buying an entry-level GPU should know that they need to compromise; a $300-400 product in 2025 is really an entry-level $150-or-lower product from 2018, imho.

Both of the $299 cards give me 1650 Super or 1060 3GB vibes.
 