
NVIDIA GeForce RTX 4060 Ti 16 GB

It's a shame TPU doesn't do productivity testing on GPUs, since non-gaming workloads are just about the only ones where a 16GB card is worth looking at over the 8GB 4060 Ti.

From what I've read on other sites that do productivity testing, the 8GB variant is pretty bad and hamstrung by its very limited memory bandwidth - bad enough that in most workloads you're better off with the last-gen 3060 Ti instead. However, adding more VRAM will definitely allow larger datasets and workflows to actually run on the card, though not at competitive speeds or costs.

For those with the luxury of buying used GPUs for productivity, the 2080 Ti is still a strong option at 30-40% lower cost than a 4060 Ti 16GB.
 
@W1zzard I hate to be this guy, but you may want to mention in your conclusion that only covering performance differences is problematic at this point: HUB has already demonstrated that some games work around an 8GB limit by not loading textures at all, forcibly downgrading them, or having textures pop in after some time.

I definitely agree with the conclusion that $500 for a 16GB 4060 Ti is still too much, though.
 
The difference is crazy o_O

1% lows 8GB = 4 FPS | 16GB = 80 FPS :rolleyes:

So much for the myth that "8GB is enough".

That is specifically testing only the scenarios that overwhelm the 8GB card; in the limited time he's had with it so far, he decided to focus on the worst-case scenarios where 8GB was a proven issue.

Extremely valid testing and results, but not representative of the whole picture. If anything, it's an indicator of how crippled and terrible the 8GB cards are going to look in 2024. The more complete suite of testing with his usual 12- or 50-game benchmarks is still to come, but we're expecting those results to mirror @W1zzard's review.

Now that both HUB and TPU have 16GB variants, it'll be really useful to compare 8GB and 16GB variants of the same GPU in future games.
 
More and more games will need lots of VRAM. More VRAM means more future-proofing. AMD chose to equip the RX 6800 with as much as 16 GB and it will pay off in the long run - fine wine.


 
In the article it says:

Am I missing something here? 553 EUR is ~610 USD, not 510.
Not sure why the VAT gets excluded; even if you import it, you'd have to add your local sales tax (except for the few states that don't collect a sales tax).
Does the author live in a state where he doesn't have to pay sales tax, or am I being dumb? Please tell me.
Well, in Canada we don't add the sales tax to the retail price when we do the conversion, so why should VAT be included? It's the same thing, except baked into the price rather than added at the point of sale. Whether you consider the price including or excluding the tax is kind of a regional thing. In Canada, we definitely exclude the tax. Sounds like the article writer is just trying to do the same thing, as tax rates are going to vary from place to place.
 
In some countries the law is quite strict: all prices must always include VAT. I guess this is done so as not to mislead customers, who would otherwise need to calculate the tax themselves.
 
In the article it says:

Am I missing something here? 553 EUR is ~610 USD, not 510.
Not sure why the VAT gets excluded; even if you import it, you'd have to add your local sales tax (except for the few states that don't collect a sales tax).
Does the author live in a state where he doesn't have to pay sales tax, or am I being dumb? Please tell me.
USD prices are always quoted without tax. According to NVIDIA, the MSRP for the RTX 4060 Ti 16 GB is $499.

I live in Germany, where 19% VAT is added. So 465 EUR + 88 EUR (19% VAT) = 553 EUR. Converting 465 EUR to USD ≈ 510 USD.
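
In general, to strip VAT back out of an EU shelf price (assuming an exchange rate of roughly 1.10 USD per EUR at the time):

$$\text{net} = \frac{\text{gross}}{1 + \text{VAT rate}} = \frac{553\ \text{EUR}}{1.19} \approx 465\ \text{EUR} \approx 510\ \text{USD}$$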
 
USD prices are always quoted without tax. According to NVIDIA, the MSRP for the RTX 4060 Ti 16 GB is $499.

I live in Germany, where 19% VAT is added. So 465 EUR + 88 EUR (19% VAT) = 553 EUR. Converting 465 EUR to USD ≈ 510 USD.
If you don't mind me asking, how much did you pay? :respect:
 
So, a little cheaper here in Spain.

 

...not loading textures at all, forcibly downgrading them, or having textures pop in after some time.
This is just a bad texture MIP-level loading strategy on the software side.
If your camera frustum is pointed down at the ground with just a few plants, and the plant textures are still blurry even after a few seconds, it is 1000% the developer's fault... so it would be unfair to mention that in the conclusion, because even if your GPU has a very low amount of VRAM, swapping textures between SSD/RAM and VRAM is very fast through PCIe.

compare 8GB and 16GB variants of the same GPU in future games.
zWORMz Gaming is constantly testing older cards on new games, so it will be interesting to see this card 5+ years from now.
Heavily modded games can take advantage of 16GB already today, especially in titles that do not support MIP map streaming.
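
For anyone unfamiliar with the MIP streaming being discussed, here's a minimal sketch of the idea; all names and thresholds are made up for illustration, not taken from any real engine:

```cpp
// Minimal illustration of coverage-based MIP streaming.
// All names and numbers are illustrative, not from a real engine.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct StreamedTexture {
    int fullResLog2;      // e.g. 12 -> MIP 0 is 4096x4096
    int residentMip;      // sharpest MIP level currently in VRAM
    float screenCoverage; // fraction of screen height the texture spans
};

// Pick the sharpest MIP the current view can actually use:
// a texture covering only a few pixels needs only a small MIP level.
int desiredMip(const StreamedTexture& t, int screenHeight) {
    float pixels = std::max(t.screenCoverage * screenHeight, 1.0f);
    int neededLog2 = static_cast<int>(std::ceil(std::log2(pixels)));
    return std::clamp(t.fullResLog2 - neededLog2, 0, t.fullResLog2);
}

int main() {
    std::vector<StreamedTexture> scene = {
        {12, 12, 0.50f}, // ground plane filling half the screen
        {12, 12, 0.02f}, // small distant prop
    };
    const int screenHeight = 1440;
    for (auto& t : scene) {
        int want = desiredMip(t, screenHeight);
        if (want < t.residentMip) {
            // A real engine would kick off an async PCIe upload here;
            // until it finishes, the texture stays blurry (the "pop-in").
            std::printf("stream in MIP levels %d..%d\n", want, t.residentMip - 1);
            t.residentMip = want;
        }
    }
    return 0;
}
```

The point is that the sharpest MIP levels only need to be resident when a texture actually covers enough pixels on screen; when the streaming doesn't keep up, you get the blurry-texture pop-in described above.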
 
4070 Panther - €599, 4060 Ti 16GB Panther - €549. Madness. Just give me a regular 4060 at €299 lol
 
The difference is crazy o_O
1% lows 8GB = 4 FPS | 16GB = 80 FPS :rolleyes:
So much for the myth that "8GB is enough".
Exactly. I hope everybody who has seen that video, and other videos showing actual gameplay, finally realizes why Nvidia keeps people on low-VRAM cards: to increase the number of customers who will upgrade sooner rather than later.
Poor 1% lows on 8GB cards lead to terrible stuttering in an increasing number of games.
I hope this is the very last generation of 8GB cards from both vendors.

Now that both HUB and TPU have 16GB variants, it'll be really useful to compare 8GB and 16GB variants of the same GPU in future games.
Such tests won't tell us anything we didn't already know. The initial TPU and HUB reviews have shown what we already knew in general, namely that 8GB cards must die in the next gen of desktop GPUs. It's as simple as that. The 5060 and 8600 will need to have 12GB minimum. Even the 5050, if ever released, needs at least 10GB.
 
If I owned a lower-end 4000-series card, I would avoid DLSS3 in most games because of the added latency; paired with the low framerates of the lower 4000-series cards, the input latency could be noticeable.

Cyberpunk in the TPU review, for example, gets 55 FPS at 1440p without ray tracing. Pair that with DLSS3 and I might get "80" FPS, but the input latency will be closer to what ~40 FPS feels like (rough math below).

And as others have pointed out, support for DLSS3 isn't universal. You can't use it all the time even if you wanted to.
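
Rough numbers for that Cyberpunk example, assuming frame generation renders every other displayed frame and holds one rendered frame back for interpolation:

$$\frac{80\ \text{FPS displayed}}{2} = 40\ \text{FPS rendered} \;\Rightarrow\; t_{\text{frame}} = 25\ \text{ms}$$

plus roughly one more rendered frame of queueing, so the game responds at a ~40 FPS cadence (or slightly worse) even though the display shows 80.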

Did you ever try playing with DLSS3 and frame generation yourself before saying this? I have a 4070; I've played numerous games with DLSS3 and FG, even an FPS like Cyberpunk, and I can't feel any noticeable added latency. And when I played FPSes before, I could notice the difference between 30ms and 60ms ping in multiplayer games, so when latency is higher I usually detect it pretty fast.

It's one thing to talk about what you read on paper; it's another thing to experience it yourself. I'm pretty sure that if I put you on a computer with DLSS3 and FG and another one without it, you wouldn't be able to tell the latency difference.
 
Is it having trouble using the full 16GB effectively due to the clamshell design and the pathetic 128-bit bus?

A 3070 sees huge gains when modded from 8GB to 16GB, though it doesn't have comically low bandwidth.
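
For context on how low that bandwidth is (using the published memory specs: 18 Gbps GDDR6 on a 128-bit bus for the 4060 Ti, 14 Gbps on a 256-bit bus for the 3070):

$$\frac{128}{8} \times 18\ \text{Gbps} = 288\ \text{GB/s} \qquad \text{vs} \qquad \frac{256}{8} \times 14\ \text{Gbps} = 448\ \text{GB/s}$$

The 16GB card has twice the 3070's capacity but only about 64% of its bandwidth.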
 
Disgusting. I don't even know why the 8GB version exists, or why this one is $500. The right price for this would be around $300.
 
The 6700XT is just a better option.

16GB for a mid-range card is useless. It's like Polaris 4GB vs 8GB: the benefit is really minimal. The GPU itself isn't strong enough to make use of the extra memory.

And on top of that, users on PCIe 3.0 are still limited to x8 lanes, not x16. It's just a $100 tax for more memory that adds nothing over the 8GB model.

Nvidia is going Apple, it seems: $100 more for the extra memory, like Apple charging $100 for an extra 128GB of storage.
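
On the PCIe point, the link really is narrow once VRAM overflows (PCIe 3.0 moves roughly 0.985 GB/s per lane after 128b/130b encoding, and 4.0 doubles that):

$$8 \times 0.985 \approx 7.9\ \text{GB/s (PCIe 3.0 x8)} \qquad \text{vs} \qquad 8 \times 1.97 \approx 15.8\ \text{GB/s (PCIe 4.0 x8)}$$

Either way, that's well over an order of magnitude below the card's 288 GB/s of local VRAM bandwidth, which is why spilling over hurts so much.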
 
Did you ever try playing with DLSS3 and frame generation yourself before saying this? I have a 4070; I've played numerous games with DLSS3 and FG, even an FPS like Cyberpunk, and I can't feel any noticeable added latency. And when I played FPSes before, I could notice the difference between 30ms and 60ms ping in multiplayer games, so when latency is higher I usually detect it pretty fast.

It's one thing to talk about what you read on paper; it's another thing to experience it yourself. I'm pretty sure that if I put you on a computer with DLSS3 and FG and another one without it, you wouldn't be able to tell the latency difference.
You'd only notice it at low framerates. The lower-performing the GPU, the more noticeable the increased input latency of DLSS3 will be. Once you get down to sub-60 FPS with DLSS2, it will be noticeable, as DLSS3 essentially reverts your input latency right back to where it was at native resolution, sometimes worse, rarely better. In other words, if you were getting 60 FPS with DLSS2 and switched to DLSS3, your input latency would be about what it would be at ~30-40 FPS.

You also seem to be confused about what exactly DLSS3 is. DLSS3 is DLSS2 + FG. You don't need to specify FG separately; it's the only distinguishing factor between DLSS 2 and 3.
 
This is just a bad texture MIP-level loading strategy on the software side.
If your camera frustum is pointed down at the ground with just a few plants, and the plant textures are still blurry even after a few seconds, it is 1000% the developer's fault... so it would be unfair to mention that in the conclusion, because even if your GPU has a very low amount of VRAM, swapping textures between SSD/RAM and VRAM is very fast through PCIe.


zWORMz Gaming is constantly testing older cards on new games, so it will be interesting to see this card 5+ years from now.
Heavily modded games can take advantage of 16GB already today, especially in titles that do not support MIP map streaming.
This is just a horrible take, my guy. Depending on whether the developer did a proper job or not, you're going to get either a good experience or a suboptimal one: lower fps, texture issues, or a well-optimized experience.
Just saying it's the developer's fault and not something to note in the conclusion is some level of cope and delusion.
Reviews are here to tell us the truth, disregarding all the marketing bias and questionable claims that the companies spit out, not to rule on whether it's the fault of the developer or Nvidia.
Not mentioning this makes the conclusion inconclusive.
 
@Nostras It is better to mention that in a game performance review, not in a HW performance review.
 
@Nostras It is better to mention that in a game performance review, not in a HW performance review.
I disagree. People will quote and refer to this review when asking or talking about 8GB vs 16GB. This is not information that should be consigned to individual game reviews.
As the review stands right now, I will only use the HUB review, as this one is missing some rather critical information.
Good day.
 
16GB is much better than 8GB if you are going to use it for VR. But yeah, it's too expensive, like everything else.
 
@Nostras Game performance reviews are much more helpful than GPU HW reviews, because they point out things you can't know even on a theoretical level.
I never decide based on a HW review, because everything I need to know is just the card's parameters and two fundamental pieces of knowledge: how a GPU works and how game code works.
Making decisions based on fps alone can only negatively affect a graphics card's reputation.
 
The 6700XT is just a better option.

Yes.

16GB for a mid-range card is useless.

No. 16 GB in this case is essential; otherwise you hit a brick wall, because 8 GB introduces severe performance limitations.
It could have been better with 10, 11, 12 or 14 GB, but Nvidia makes wrong decisions all the time...
 