
NVIDIA GeForce RTX 4060 Ti 16 GB

What PSU limitations? A basic 500 W unit (not the cheapest kind) can drive a 6600 XT with any CPU.

If by "Ada", you mean the 4060, then sure, I agree. The rest are too expensive, imo.
The 6800 XT at the same price point as this x60 Ti needs much more. A 6600 XT is as weak a card as the x60 is; it's not even worth looking at, imho. The 6700 is in an okay place in that sense too, still doing ok without an out-of-tier-rated PSU.

The x60 is also too expensive; it's a penny-wise, pound-stupid purchase. This gen, you either go upper-midrange or better, or you really shouldn't bother at all. Rather, get something dirt cheap second-hand to sit it out until 2025 or something.

The 6600 XT is in fact similar to my old 1080, going by relative perf. Or a 2060. It's ancient by these standards, near obsolete unless you drop the bar to the bottom. You're looking at medium 1080p, roughly.
 
A 6600 XT is as weak a card as the x60 is; it's not even worth looking at, imho.

Yes, it is. The performance gap between the tiers is extreme: the RTX 4090 is 243% faster, and the RTX 4070 Ti is 112% faster:

[attached chart: relative performance]
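(For reference, those "faster" figures are just ratios taken off the relative-performance chart. A minimal sketch below, with made-up index values chosen only to reproduce the percentages quoted above:)

```python
# Rough sketch of how "X% faster" figures are derived from relative-performance scores.
# The index values are hypothetical stand-ins, not the chart's actual numbers.
def percent_faster(score_a: float, score_b: float) -> float:
    """How much faster card A is than card B, in percent."""
    return (score_a / score_b - 1.0) * 100.0

scores = {"RTX 4090": 3.43, "RTX 4070 Ti": 2.12, "RX 6600 XT": 1.00}  # illustrative indices

for card in ("RTX 4090", "RTX 4070 Ti"):
    print(f"{card} is {percent_faster(scores[card], scores['RX 6600 XT']):.0f}% "
          f"faster than the RX 6600 XT")
```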
 
No, it proves the 4060 Ti is shit no matter what VRAM capacity it has, because both the 8 GB and 16 GB versions are hamstrung by bottom-tier bandwidth. Memory is not just capacity.

The better half of the Ada stack is a pointless release beyond the power-efficiency gains over Ampere - and those only look good because Ampere was bottom-tier efficiency to begin with, while having much better bandwidth.

The x70/Ti is going to suffer a similar fate 2-3 years down the line. Ada is lazy, badly positioned, and grossly overpriced for what is a tiny die, with equally tiny memory systems to support it.

Every badly balanced GPU is going to get killed by its weakest link.

No, it does prove what he just said. It's time to stop twisting reality to make influencers and some brilliant minds look more insightful. The argument made (for those without amnesia) was that the 4060 Ti should have more VRAM, and it praised the 3060 for its 12 GB of VRAM.

Now it's too late to get the genie back in the bottle, sorry.
 
Many thanks @W1zzard for the review and getting your own sample in!
 
The argument made (for those without amnesia) was that the 4060 Ti should have more VRAM, and it praised the 3060 for its 12 GB of VRAM.

Agreed.

RTX 4090 24GB
RTX 4080 20GB
RTX 4070 Ti 16GB
RTX 4070 16GB
RTX 4060 Ti 12GB
RTX 4060 10GB
RTX 4050 Ti 10GB
RTX 4050 8GB
 
No, it does prove what he just said. It's time to stop twisting reality to make influencers and some brilliant minds look more insightful. The argument made (for those without amnesia) was that the 4060 Ti should have more VRAM, and it praised the 3060 for its 12 GB of VRAM.

Now it's too late to get the genie back in the bottle, sorry.
288 GB/s, mate. Nuff said... you can stick to your delusions, np

The 12 GB 3060:
Bandwidth: 360.0 GB/s

That's 25% more bandwidth AND 4 GB more. You have some missing GPU education, that is all you prove with your comments on this matter, sorry. Also, I'm not sure where your influencer take comes from; I've been saying similar stuff about VRAM on Nvidia since Ampere released with 10 GB on a 3080.
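(For context, those bandwidth figures fall straight out of bus width and memory data rate; a quick sanity check, assuming 18 Gbps GDDR6 on a 128-bit bus for the 4060 Ti and 15 Gbps on a 192-bit bus for the 3060 12 GB, and ignoring Ada's larger L2 cache, which claws back some of the raw deficit:)

```python
# bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (Gbps per pin)
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

rtx_4060_ti = bandwidth_gb_s(128, 18.0)   # 128-bit bus, 18 Gbps GDDR6 -> 288.0 GB/s
rtx_3060_12g = bandwidth_gb_s(192, 15.0)  # 192-bit bus, 15 Gbps GDDR6 -> 360.0 GB/s

print(rtx_4060_ti, rtx_3060_12g)
print(f"{(rtx_3060_12g / rtx_4060_ti - 1) * 100:.0f}% more bandwidth on the 3060")  # 25%
```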
 
The 6600 XT is in fact similar to my old 1080, going by relative perf. Or a 2060. It's ancient by these standards, near obsolete unless you drop the bar to the bottom. You're looking at medium 1080p, roughly.
What? 20% slower than a 3060 Ti, or 32% slower than a 4060 Ti is ancient? :wtf:
[attached chart: relative performance]


Also:
[attached chart: relative performance]


If 20% compared to the 3060 Ti means a lot, then 20% compared to the 1080 means a lot, too - it's not similar.
 
ok chill please peeps, let's stay on topic!
 
What PSU limitations?
The 7900 XTX eats about as much as the 4090 does but is slower, and by a lot.
7600 eats about as much as 4060 Ti but loses to 4060 speedwise.

RDNA3 is utter garbage in terms of energy efficiency compared to Ada. RDNA2 is even worse. That's why their value is lower. Not everyone is in desperate need of spending extra cash on cooling and electricity bills.

Oh and by the way, transients in RDNA2 are humongous. They are way higher than in ANYTHING else. Especially in early 6900 XTs which can perform 750 W transient spike kickflips.
 
In the article it says:
I've paid 553 Euros for the Gainward RTX 4060 Ti 16 GB, which includes 20% VAT and converts to roughly 510 USD.
Am I missing something here? 553 EUR is ~610 USD - not 510
Not sure why the VAT gets excluded; even if you import it, you'd have to add your local sales tax (except in the few states that don't collect one).
Does the author live in a state where he doesn't have to pay sales tax or am I being dumb? pls tell me
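(The arithmetic seems to work out if you strip the VAT first: US prices and Nvidia's MSRPs are quoted before sales tax, so the review presumably removes VAT for an apples-to-apples comparison. A quick check, where the ~1.10 USD/EUR exchange rate is my assumption, not a figure from the article:)

```python
# Back-of-the-envelope check of the number in the review.
price_eur_incl_vat = 553.0
vat_rate = 0.20
usd_per_eur = 1.10  # assumed exchange rate around the review date

price_eur_ex_vat = price_eur_incl_vat / (1 + vat_rate)  # ~460.8 EUR without 20% VAT
price_usd_ex_vat = price_eur_ex_vat * usd_per_eur       # ~507 USD -> "roughly 510 USD"

print(round(price_eur_ex_vat, 2), round(price_usd_ex_vat))
```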
 
What? 20% slower than a 3060 Ti, or 32% slower than a 4060 Ti is ancient? :wtf:
[attached chart: relative performance]

Also:
[attached chart: relative performance]

If 20% compared to the 3060 Ti means a lot, then 20% compared to the 1080 means a lot, too - it's not similar.
To me it is; I've honestly never upgraded for anything that isn't way over 50% up in perf. It's a complete waste of time in actual gameplay, in my experience, though what matters most is where you are coming from. If you have an iGPU, the 6600 XT is a great place to start. If I own anything like a 2060, it's pointless.

And the longer you wait, the better it should get; 1080 to 7900 XT is over a 200% bump. That's worth a few bucks, and they'll be much better spent than periodically doing half that purchase for a smaller jump.
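(A quick illustration of why one big jump beats several small ones - the uplift figures below are illustrative, not measured values:)

```python
# Compound several generational uplifts into one overall percentage.
def total_uplift(*uplifts_pct: float) -> float:
    factor = 1.0
    for u in uplifts_pct:
        factor *= 1.0 + u / 100.0
    return (factor - 1.0) * 100.0

# Two mid-cycle upgrades of ~50% each compound to +125%...
print(f"{total_uplift(50, 50):.0f}%")   # 125%
# ...still short of a single 200%+ jump (e.g. GTX 1080 -> 7900 XT),
# and you paid for two cards instead of one.
print(f"{total_uplift(200):.0f}%")      # 200%
```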
 
The 7900 XTX eats about as much as the 4090 does but is slower, and by a lot.
7600 eats about as much as 4060 Ti but loses to 4060 speedwise.

RDNA3 is utter garbage in terms of energy efficiency compared to Ada. RDNA2 is even worse. That's why their value is lower. Not everyone is in desperate need of spending extra cash on cooling and electricity bills.

Oh and by the way, transients in RDNA2 are humongous. They are way higher than in ANYTHING else. Especially in early 6900 XTs which can perform 750 W transient spike kickflips.
I was talking about RDNA 2. Just because it's not as efficient as Ada, it doesn't mean you need a kilowatt PSU for a 6700 XT. I don't understand how power consumption that was completely normal a year or two ago is now regarded as garbage just because something new is out, instead of that new thing being praised for being so efficient.

I've honestly never upgraded for anything that isn't way over 50% up in perf. It's a complete waste of time in actual gameplay, in my experience
I agree, but that makes the 6600 XT and 6650 XT way better deals than anything else in the chart that I copied in.
 
I was talking about RDNA 2. Just because it's not as efficient as Ada, it doesn't mean you need a kilowatt PSU for a 6700 XT. I don't understand how power consumption that was completely normal a year or two ago is now regarded as garbage just because something new is out, instead of that new thing being praised for being so efficient.
Ampere and RDNA2 were garbage on release tbh. Both because of transient spiking. The absolute watt/frame wasn't stellar either, though not horrible, I agree. But the spikes... painful.
 
I don't understand how power consumption that was completely normal a year or two ago is now regarded as garbage just because something new is out, instead of that new thing being praised for being so efficient.
For me, it was off for at least 5 years. My first impression when I saw Ampere and RDNA2 was, "wait a sec, I won't even hit my 40s before mid-range becomes 1 kW worth of TGP!"
 
Ampere and RDNA2 were garbage on release tbh. Both because of transient spiking. The absolute watt/frame wasn't stellar either, though not horrible, I agree. But the spikes... painful.
Shouldn't be a problem if you keep to the first rule of PC building: never cheap out on your PSU (that is, buy the highest quality unit of your desired tier).

For me, it was off for at least 5 years. My first impression when I saw Ampere and RDNA2 was, "wait a sec, I won't even hit my 40s before mid-range becomes 1 kW worth of TGP!"
With Ampere, I agree, but mid-range RDNA 2 eats about as much as mid-range RDNA 1 did, which is not great, but not terrible.
 
RDNA 2 eats about as much as mid-range RDNA 1 did, which is not great
This is more accurate.
never cheap out on your PSU
In 2016, a 500 W unit was enough for basically every card but the enthusiast 1080 Ti.
In 2023, a 500 W unit is enough for basically only low and lower-mid-tier cards. PSUs don't get cheaper. Their quality doesn't improve by much. GPUs become more and more watt-happy. This sucks. And, unfortunately, the trend shows no sign of stopping.
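(As a rough illustration of the point - all figures below are assumed ballpark numbers, not measurements:)

```python
# Very rough system-draw estimate, in the spirit of the 500 W discussion above.
def system_draw(gpu_tbp: float, cpu_watts: float, rest_watts: float = 75,
                gpu_transient_mult: float = 2.0):
    sustained = gpu_tbp + cpu_watts + rest_watts
    # short GPU transient spikes can be a multiple of board power
    # (notoriously bad on early RDNA 2 and Ampere cards)
    peak = gpu_tbp * gpu_transient_mult + cpu_watts + rest_watts
    return sustained, peak

print(system_draw(gpu_tbp=160, cpu_watts=90))   # ~4060 Ti class: ~325 W sustained, ~485 W peak
print(system_draw(gpu_tbp=300, cpu_watts=120))  # upper-midrange 2023 card: ~495 W sustained, ~795 W peak
```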
 
The review doesn't paint the whole picture. Either the chip itself is so slow that it can't make use of more than 8 GB, or Nvidia cheats and automatically lowers image quality to fit within the available VRAM buffer.

There is a difference if you know where to look.

[attached screenshot: texture quality comparison]

Hogwarts Legacy has that issue - it lowers textures on buildings when I enter Hogsmeade on my old 2070 Super 8 GB, while the RX 6800 with 16 GB doesn't have that issue for me.
 
This is more accurate.

In 2016, a 500 W unit was enough for basically every card but the enthusiast 1080 Ti.
In 2023, a 500 W unit is enough for basically only low and lower-mid-tier cards. PSUs don't get cheaper. Their quality doesn't improve by much. GPUs become more and more watt-happy. This sucks. And, unfortunately, the trend shows no sign of stopping.
GPUs are more watt-happy, more expensive, and less necessary. In 2010, I needed at least an x80-tier GPU to run games at decent framerates. Today, an x60-tier one is more than enough. Higher tiers are for ultra-high resolutions and/or high refresh rates, not for mainstream gaming. We need to stop thinking that an X-tier GPU is the same thing as it was 10 or 20 years ago.

That 500 W PSU gave you headroom for hardware for decent 1080p gaming in 2016 just like it does now.
 
GPUs are more watt-happy, more expensive, and less necessary. In 2010, I needed at least an x80-tier GPU to run games at decent framerates. Today, an x60-tier one is more than enough. Higher tiers are for ultra-high resolutions and/or high refresh rates, not for mainstream gaming. We need to stop thinking that an X-tier GPU is the same thing as it was 10 or 20 years ago.

That 500 W PSU gave you headroom for hardware for decent 1080p gaming in 2016 just like it does now.
That's the kicker though: newer cards are in fact struggling again in recent games, even at lower resolutions, in the x60 tier. So we are back where we started... you can cheap out on the PSU, but you get a GPU that's weaksauce. And it still costs quite a lot.

The bottom line here is a problem we all somehow acknowledge, I think, and that is stagnation.
 
A 4060 Ti doesn't get you 4K at all. And it costs over $500 for the 16 GB versions.
True that. There's a much better option than the 4060 Ti 16 GB. The 6800 XT with 16 GB of VRAM is an entry-level 4K card and can be bought for roughly $500, which is a great deal. I've had one since December 2020. There are choices from AMD to tackle this heavily overpriced class-60 card. That's the only thing wrong with it - an absurd price for a 1080p card. My brain hurts trying to work out what Nvidia is trying to achieve here. No samples for reviews says it all.
That also means they meant to sell what is now the 4060 Ti at probably $600. That is a total joke.
Agreed. That's why I stopped buying Nvidia cards a few years ago.
 
This card is the best joke I've ever heard of.
 
The review doesn't paint the whole picture. Either the chip itself is so slow that it can't make use of more than 8 GB, or Nvidia cheats and automatically lowers image quality to fit within the available VRAM buffer.
Indeed, FPS doesn't tell the whole story. If you look closely in modern games, you might see missing or muddy textures. But either way, a 128-bit bus is not enough - for either version of this card.
 
That's the kicker though: newer cards are in fact struggling again in recent games, even at lower resolutions, in the x60 tier. So we are back where we started... you can cheap out on the PSU, but you get a GPU that's weaksauce. And it still costs quite a lot.

The bottom line here is a problem we all somehow acknowledge, I think, and that is stagnation.
I've yet to see an x60 card struggle at 1080p, but I see your point. Nvidia has been stagnant since Turing. Ada could have been a breakthrough with its efficiency, but the chips are so small that you don't gain much on the performance front, yet they still cost an arm and a leg because, you know, Nvidia. Meanwhile, AMD has been throwing idea after idea into the pot, but none of them seems to work without issues (chiplet design with high idle power, inefficient RT cores, AI that does nothing, etc.).

Although, none of this would be a problem if value was matched by price, and that's where the real problem is, imo.
 
I bought a 7900 XTX at MSRP in Europe the week it launched. Fantastic 4K card. I do not feel "milked". I feel I invested in a great 4K gaming product for several years.


That one, yes, but there is a new one from today, testing the 4060 Ti 8 GB vs the 4060 Ti 16 GB. His thumbnail says it all...

The difference is crazy o_O

1% lows: 8 GB = 4 FPS | 16 GB = 80 FPS :rolleyes:

So much for the myth that "8 GB is enough".

[attached image: 8 GB vs 16 GB comparison thumbnail]
 
Thanks for the review, but you should include DLSS 3 frame-gen-enabled FPS in your game benchmark graphs for the games that support it, like Cyberpunk.

Because who is going to buy a 4xxx-series card and play these games with DLSS 3 and FG off? It doesn't make any sense. It's like buying an electric bike but only testing it with the motor off.

For people who are not familiar with the latest technologies and only look at the graphs, it gives the false impression that a 3070 is better at running Cyberpunk than a 4060, while with FG the 4060 crushes the 3070.
If I owned a lower-end 4000-series card, I would avoid DLSS 3 in most games because of the added latency; paired with the low framerates of the lower 4000-series cards, the input latency could be noticeable.

Cyberpunk in the TPU review, for example, gets 55 FPS at 1440p without ray tracing. Pair that with DLSS 3 and I might get "80" FPS, but the input latency will be closer to what ~40 FPS feels like.

And as others have pointed out, support for DLSS3 isn't universal. You can't use it all the time even if you wanted to.
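(A toy model of that effect, assuming frame generation inserts one interpolated frame between every two rendered frames and that input is only sampled on rendered frames; the overhead value is an assumption, and this is not how DLSS 3 is implemented internally:)

```python
# Rough sketch of why frame generation boosts the FPS counter more than it helps latency.
def framegen_estimate(base_fps: float, gen_overhead_ms: float = 6.0):
    base_frametime = 1000.0 / base_fps
    # one generated frame between every two rendered frames roughly halves displayed frametime
    displayed_fps = 2000.0 / (base_frametime + gen_overhead_ms)
    # input is sampled only on rendered frames, and one rendered frame is held back
    # for interpolation, so latency corresponds to a rate *below* the base framerate
    latency_equiv_fps = 1000.0 / (base_frametime + gen_overhead_ms)
    return displayed_fps, latency_equiv_fps

print(framegen_estimate(55))  # ~55 FPS rendered -> ~83 FPS shown, but it feels closer to ~41 FPS
```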
 