
4060TI Questions?

Joined: Jun 2, 2017
Messages: 9,828 (3.35/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2TB, Adata SX8200 2TB x2, Kingston 2TB x2, Micron 8TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64-bit; Steam, GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Looks like MSI Gaming will be live-streaming the 4060 Ti 16GB card today at 10 AM Eastern. As far as I know, this is the only channel that has the 4060 Ti, but since it's a live stream, questions can be revealing.
 
What questions? What revelations? It's the same card with 8GB more VRAM for $100 more.
 
What questions? What revelations? It's the same card with 8GB more VRAM for $100 more.
Watching right now, and it is slower than the 4060 8GB in every game so far.
 
Watching right now, and it is slower than the 4060 8GB in every game so far.

How can the 4060 be faster than the 4060 Ti? I'm not watching it, so I don't know what comparison they're doing, but I doubt the non-Ti is faster than the Ti variant.
 
What questions? What revelations? It's the same card with 8GB more VRAM for $100 more.

I wanted to see what piggybacked memory looks like (or whatever the other word for it is), and whether it gets hotter.
 
How can the 4060 be faster than the 4060 Ti? I'm not watching it, so I don't know what comparison they're doing, but I doubt the non-Ti is faster than the Ti variant.
Sorry, it was indeed the 4060 Ti 8GB.

I wanted to see what piggybacked memory looks like (or whatever the other word for it is), and whether it gets hotter.
The livestream is about 2.25 hours long, but the games showed no difference between 8GB and 16GB, and in some cases the 8GB variant is faster. This card is a total fail for $499.
 
The livestream is about 2.25 hours long, but the games showed no difference between 8GB and 16GB, and in some cases the 8GB variant is faster.
Kinda makes you wonder why they bothered with the livestream, eh? All they did was put egg on their own faces.
This card is a total fail for $499.
Oh, I knew that was going to happen. I don't know what drugs they were on when they decided to price it to square off with the RX 6800 XT.

When I saw that the RTX 4060 Ti 16GB would be going up against the RX 6800 XT, I predicted a "Romulan Bloodbath" (lots of green blood flowing), because the RX 6800 XT also has 16GB of VRAM and is a whopping 32% faster (which is like two performance tiers). That's like putting a GTX 1060 up against a GTX 1080 for the same price. It's such a bad idea that I didn't really believe nVidia would try it; I thought they'd pull an "RX 7600" and drop the price by $30-$50 at the 11th hour, but here we are...
 
Where are the people normally championing VRAM?

I thought 8 GB "wasn't enough"?

I'm guessing the extra GDDR6 chips eat into the power budget, similar to how the 3090 was somewhat core-starved due to its 24 x 1 GB chips. The 3090 Ti was faster not just from the added cores but from its 12 x 2 GB chips as well, allowing the core to use more of the power budget.
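
The chip-count arithmetic behind those configs is simple: GDDR6/6X devices have a 32-bit interface, so the bus width fixes the chip count, and clamshell mode (two chips sharing each 32-bit channel, one on each side of the PCB) doubles it. A rough back-of-the-envelope sketch, using densities and bus widths from the published spec sheets (the helper function is illustrative, not from any library):

```python
def vram_config(bus_width_bits: int, chip_density_gb: int, clamshell: bool = False):
    """Return (chip count, total VRAM in GB) for a GDDR6/6X memory subsystem.

    Each GDDR6/6X chip has a 32-bit interface, so chip count follows from the
    bus width; clamshell mode doubles the chips without widening the bus.
    """
    chips = bus_width_bits // 32 * (2 if clamshell else 1)
    return chips, chips * chip_density_gb

print(vram_config(384, 1, clamshell=True))   # RTX 3090: (24, 24) -> 24 x 1GB
print(vram_config(384, 2))                   # RTX 3090 Ti: (12, 24) -> 12 x 2GB
print(vram_config(128, 2, clamshell=True))   # RTX 4060 Ti 16GB: (8, 16) -> 8 x 2GB
```

So the 16GB card doubles the chips on the same 128-bit bus, which is exactly the layout that cost the 3090 part of its power budget.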
 
Where are the people normally championing VRAM?

I thought 8 GB "wasn't enough"?

I'm guessing the extra GDDR6 chips eat into the power budget, similar to how the 3090 was somewhat core-starved due to its 24 x 1 GB chips. The 3090 Ti was faster not just from the added cores but from its 12 x 2 GB chips as well, allowing the core to use more of the power budget.

Based on clickbait YouTubers and games that are complete disasters of poor optimisation.
Time to stop watching the idiots.
 
Where are the people normally championing VRAM?

I thought 8 GB "wasn't enough"?

I'm guessing the extra GDDR6 chips eat into the power budget, similar to how the 3090 was somewhat core-starved due to its 24 x 1 GB chips. The 3090 Ti was faster not just from the added cores but from its 12 x 2 GB chips as well, allowing the core to use more of the power budget.

You kind of answered your own question. If the 16GB card doesn't have better performance, it's down to factors outside the VRAM size, like the power budget. It's already been proven on multiple occasions that 8GB is detrimental to performance compared to 12 or 16GB. And that's despite people arguing that such cards weren't powerful enough to benefit (another disproven argument).


Based on clickbait YouTubers and games that are complete disasters of poor optimisation.
Time to stop watching the idiots.

Character assassination is no substitute for a legitimate basis for an argument. GamersNexus already debunked the poor-optimization line of argument. You cannot expect game devs to stick to 8GB beyond the eight-plus years they already have. People always make this argument when VRAM requirements increase in games, and it has yet to be proven correct a single time.


On topic, if the 4060 Ti 16GB does turn out to be a worse card due to power budget issues, that's entirely on Nvidia and may even be intentional to upsell users to more expensive cards.
 
Where are the people normally championing VRAM?
We're waiting for some legitimate reviews, for starters. Second, VRAM size is only one part of the problem. VRAM bandwidth outside of L2 cache is another.
 
We're waiting for some legitimate reviews, for starters. Second, VRAM size is only one part of the problem. VRAM bandwidth outside of L2 cache is another.
I look forward to reading some of these more legitimate reviews (I categorically ignore all the YouTubers).

I assume the reviewers are all going to come to the same conclusion: the additional memory is hampered by the 128-bit memory bus and the $100+ upcharge isn't worth it. Nvidia released some slides and even their biased spin is pretty ugly.

From a performance-per-dollar metric, the 4060 Ti 16GB looks to be an abject failure compared to the 8GB model, which is a poor value anyhow.

Nvidia knows all this which is why they pushed the 4060 Ti 16GB off the loading dock with zero fanfare. The 16GB model will undoubtedly have a handful of usage cases where the extra VRAM makes sense (I dunno, maybe content creation) but it looks like a dud as a consumer gaming product.
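
To put numbers on that 128-bit bus complaint: peak bandwidth is just bus width times per-pin data rate. A quick back-of-the-envelope check, using the published GDDR6 speeds for these cards (18 Gbps for the 4060 Ti, 14 Gbps for the last-gen 3060 Ti); the helper is illustrative:

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak VRAM bandwidth in GB/s: (bus width in bits / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(128, 18.0))  # RTX 4060 Ti (8GB and 16GB): 288.0 GB/s
print(peak_bandwidth_gbs(256, 14.0))  # RTX 3060 Ti: 448.0 GB/s
```

So even with double the VRAM, the 16GB card has roughly 36% less raw bandwidth than its direct predecessor, which is what the enlarged L2 cache is supposed to paper over.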
 
GamersNexus already debunked the poor-optimization line of argument. You cannot expect game devs to stick to 8GB beyond the eight-plus years they already have. People always make this argument when VRAM requirements increase in games, and it has yet to be proven correct a single time.

No, he did not! In fact, The Last of Us has improved a lot with 8GB of VRAM; it's not even an issue anymore.
 
Where are the people normally championing VRAM?
Championing an extra 8GB of VRAM for an extra $100USD is just plain stupidity.
I thought 8 GB "wasn't enough"?
Oh, it's perfectly fine... as long as you only game at 1080p (which, admittedly, most people do). However, 1080p gaming cards should never cost more than $250USD, and it becomes a problem when companies try to get $300USD+ for an 8GB 1080p card. The product itself isn't the problem because, after all, nVidia does make great products. The problem is the price that they're trying to get for it.

To be perfectly honest, I really don't care what nVidia does because I'm perfectly happy gaming with Radeons which means that their pricing doesn't affect me directly. I have a kind of immunity to Jensen's pricing schemes. I'm not completely immune because AMD just follows their lead (like a bunch of dumb lunatics) but it's not as bad as paying actual GeForce prices. The babies who whine and cry about GeForce pricing only to go and buy yet another GeForce card (it's both funny and sad at the same time) are the ones who suffer the most but it's suffering of their own making. What annoys me about nVidia is how Jensen consistently insults the intelligence of his customers (it's just not a good look). What annoys me even more is how his customers consistently prove that he's not wrong (which means that I can't really blame him).

In this case however, based on the pitiful sales numbers of the RTX 4060 Ti 8GB and RTX 4060, Jensen may have gone too far with the RTX 4060 Ti 16GB at $500USD. The idea that, at the $500USD price-point, the RTX 4060 Ti has better than a snowball's chance in hell at competing with the RX 6800 XT is so ludicrous that even the most green-blooded fanboy won't be able to justify buying it. This time, I don't think that Jensen's gambit will pay off.
 
No, he did not! In fact, The Last of Us has improved a lot with 8GB of VRAM; it's not even an issue anymore.

Improved, but it still has issues. Just like RE4, Hogwarts Legacy, and Forspoken still have issues on 8GB. You can very clearly see textures being swapped in and out due to a lack of VRAM. Forget about using RT; that's simply not recommended. That's right now, let alone in the future. The 4060 Ti is a $400 video card; it's absolutely not acceptable for it to be encountering these issues out of the gate.
 
Improved, but it still has issues. Just like RE4, Hogwarts Legacy, and Forspoken still have issues on 8GB. You can very clearly see textures being swapped in and out due to a lack of VRAM. Forget about using RT; that's simply not recommended. That's right now, let alone in the future. The 4060 Ti is a $400 video card; it's absolutely not acceptable for it to be encountering these issues out of the gate.

the usual quartet of shitty optimisation
 
Much like with CPUs, you judge a GPU on performance, not just one cherry-picked spec.
 
Clamshell memory requires a total redesign of the PCB; no wonder it's more expensive.
 
The PCBs are 90% the same. nVidia would just like you to believe it's complex.
 
I thought 8 GB "wasn't enough"?
Well, it's not for the newest AAA games. But there's more than just one variable to consider. When we said we wanted 16GB, we didn't mean on a 128-bit bus.
 
Well, it's not for the newest AAA games. But there's more than just one variable to consider. When we said we wanted 16GB, we didn't mean on a 128-bit bus.
Sounds like the same crap ngreedia pulled on the 970.
 
the usual quartet of shitty optimisation
We can't expect game developers to spend all this extra time and energy optimizing for decade-old memory configs forever. Especially now that current-gen consoles have more, even though it's unified. 8GB had its time in the sun (a long time, in fact), but it's time to move on...
 
We can't expect game developers to spend all this extra time and energy optimizing for decade-old memory configs forever. Especially now that current-gen consoles have more, even though it's unified. 8GB had its time in the sun (a long time, in fact), but it's time to move on...

Oh yes, all those decade-old 3060, 3070, 6600, 4060, 4070 and 7600 cards. The owners of those decade-old cards should get a job and buy some new ones, stop living off welfare, pull themselves up by the bootstraps and buy a 4090; they're so incredibly accessible.

lowlife gamers disgust me tbh
 
Oh yes, all those decade-old 3060, 3070, 6600, 4060, 4070 and 7600 cards. The owners of those decade-old cards should get a job and buy some new ones, stop living off welfare, pull themselves up by the bootstraps and buy a 4090; they're so incredibly accessible.

lowlife gamers disgust me tbh
I said decade-old memory configs... not cards. Half those cards don't have 8GB anyway... so I'm not quite sure what your point is.

My point is that the industry is moving on, and 12GB is becoming the new standard. If a game can run on 8GB at 1080p or 1440p, then that's great. I'm just saying, those days are coming to an end. We knew this was going to happen two years ago, and the time is upon us.

I don't know how many times we can blame it on 'optimization' when the reality is that these games are primarily made for consoles with 16GB of unified GDDR memory, with (I believe) 10.5GB available on the Xbox for graphics and technically no limitation on the PS5... but it does need to store other kinds of data and whatnot, so it's around 12GB available for graphics. It takes a lot of work to get that quality of textures into a buffer of 8GB, especially with PCs being less efficient with their memory use in general. It's been done in the past, but developers have straight up said their new targets are 12GB for high settings.

And nowhere did I say you need to buy a 4090, so that's just a strawman. You have plenty of cheaper options like the 6700 (10GB), 6700 XT (12GB), 4070 (12GB), 6800 (16GB), 6800 XT (16GB), 6950 XT (16GB), or A770 (16GB). Or a used 2080 Ti (11GB), 3080 Ti (12GB), or 3090 (24GB). You have lots of options and don't need to break the bank to leave 8GB behind.

Or if you don't mind low-quality textures in new games, that's fine too. I still use a computer with 32MB of VRAM for old games and emulators. But if we're talking new AAA games at high settings (even 1080p in some cases), 8GB ain't cutting it anymore. Not without a lot of extra work that developers aren't going to keep doing forever.

So no, I don't think $400 USD 8GB cards are acceptable. Nobody who wants to play new games at high settings going forward should buy one. Same goes for the 16GB version, but for different reasons.
 