
3060 Ti 8GB vs 3060 12GB

Resolution + Textures + AA

Those three are the VRAM eaters; the rest of the settings mostly affect GPU load.
So quite often all you need to do is drop from Ultra to High or Medium, and you halve the VRAM usage with the eye candy still looking the same.
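To put rough numbers on the resolution part (back-of-envelope arithmetic only, not measurements from any particular game):

```python
# Back-of-envelope framebuffer sizes: width * height * 4 bytes (RGBA8).
# Illustrative only; real engines add depth buffers, G-buffers, etc.
for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    mb = w * h * 4 / 2**20
    print(f"{name}: {mb:5.1f} MB color target, {mb * 4:6.1f} MB with 4x MSAA")
```

Textures dwarf that: a single uncompressed 4096x4096 RGBA8 texture with mipmaps is roughly 85 MB, versus roughly 21 MB block-compressed, which is why the texture slider moves VRAM usage so much more than most other settings.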

DLSS, image scaling, and FSR all let us render below native res while outputting at native res - so we're finally getting on PC the methods consoles have always used to cheat performance.

Render at 1080p, output at 4K with sharpening, and away you go. Suddenly you don't need a monster GPU or 24GB of VRAM.
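The pixel arithmetic behind that (plain math, no game-specific numbers):

```python
# Shading work scales with rendered pixels, not output pixels.
native = 3840 * 2160    # 4K output: ~8.3 million pixels
render = 1920 * 1080    # 1080p internal render: ~2.1 million pixels
print(render / native)  # 0.25 -> roughly a quarter of the pixel-shading work
```

The upscale/sharpen pass isn't free, but it's far cheaper than shading the other 75% of the pixels.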
Very true. It's frankly about time we start being smarter about performance instead of just brute-forcing it. Computers are supposed to be kind of smart, no?

There have been quite a few games with built-in scalers before DLSS though (or after it launched but still lacking it) - they just use kind of shitty bilinear scaling, which loses a lot of quality. AFAIK that's what most console games have used too, though some use more clever solutions like checkerboard rendering.
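For anyone curious what bilinear scaling actually does, here's a minimal NumPy sketch (illustrative only - GPUs do this in fixed-function texture hardware): each output pixel is just a weighted average of the four nearest source pixels, which is exactly why it looks soft next to DLSS/FSR.

```python
import numpy as np

def bilinear_upscale(img, out_h, out_w):
    """Upscale an (H, W, C) image by blending the 4 nearest source pixels."""
    in_h, in_w = img.shape[:2]
    # Map each output pixel centre back into source coordinates.
    ys = (np.arange(out_h) + 0.5) * in_h / out_h - 0.5
    xs = (np.arange(out_w) + 0.5) * in_w / out_w - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, in_h - 1)
    x0 = np.clip(np.floor(xs).astype(int), 0, in_w - 1)
    y1 = np.minimum(y0 + 1, in_h - 1)
    x1 = np.minimum(x0 + 1, in_w - 1)
    wy = np.clip(ys - y0, 0, 1)[:, None, None]   # vertical blend weights
    wx = np.clip(xs - x0, 0, 1)[None, :, None]   # horizontal blend weights
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

frame = np.random.rand(270, 480, 3)               # stand-in for a low-res render
print(bilinear_upscale(frame, 1080, 1920).shape)  # (1080, 1920, 3)
```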
 
The games could cheat a little, rendering the 2D elements and HUD at 4K, with the game at lower res. That allowed text to be crystal clear at all times, with the 3D elements dynamically changing.
I've seen that in quite a few modern games already (D2 Resurrected has it, working well too), so I guess that's how things will progress now.
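A toy sketch of that trick (NumPy stand-ins; every name and size here is hypothetical): the "3D scene" is rendered at half res and upscaled, then the "HUD" is stamped into the frame at full res, so UI pixels never pass through the scaler.

```python
import numpy as np

def compose_frame(native_h=1080, native_w=1920, scale=2):
    low_h, low_w = native_h // scale, native_w // scale
    scene = np.random.rand(low_h, low_w, 3)                    # stand-in for the 3D pass
    frame = scene.repeat(scale, axis=0).repeat(scale, axis=1)  # cheap nearest-neighbour upscale
    hud = np.zeros((native_h, native_w, 3))                    # stand-in for native-res text/HUD
    hud[100:160, 100:700] = 1.0                                # a "health bar"
    mask = hud.any(axis=-1)
    frame[mask] = hud[mask]                                    # HUD composited at full res
    return frame

print(compose_frame().shape)  # (1080, 1920, 3) - crisp HUD over upscaled 3D
```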


To make games sell, they have to make them prettier than the last game. But now, with the parts shortage, they can't - so to avoid losing sales they're working on performance, for a change.
 
True. But developers are also always pushed to finish games ASAP, which leads to quick fixes and poorly optimized code, not to mention insufficient time to learn how to make the most of the hardware. The quality change from early to late games across any console generation demonstrates this beautifully - the difference between early and late PS3 or X360 games is downright staggering. It just goes to show what you can do with the same hardware if you have enough time to learn how to really squeeze it.

In a way, DLSS/FSR allow for a third path here - not hardcore optimization, not brute-force quality improvements, but a brand-new route that can incorporate both while still being its own thing. Given that lithographic density and transistor count increases are bound to taper off sooner rather than later, it's about time we got smarter about this.
 
Ti is a cherry chip, and I thought that components between the 2 were cut on the non-Ti card
 
Well, my queue notify came through, and I ordered an EVGA RTX 3060 XC Black 12GB: 12G-P5-3655-KR. For those wondering, I entered the wait list on March 3, 2021.
Ti is a cherry chip, and I thought that components between the 2 were cut on the non-Ti card
Don't know if this is what you are referring to, but the 3060 is GA106, while the 3060 Ti is cut down from the 3070's GA104.
 

Glad to hear you were able to order!

I've got notifies pending for two 3080s, one from October 2020 and the other from November 2020. I don't really want them anymore, but my sister and her husband may need a card when/if they're ever in stock.

Congrats again!
 
Yeah, the 3080 would be nice, but it's so far beyond my budget. I usually buy in the $300-400 card segment.
 
Congrats. Hopefully it gives you the improved performance you're looking for.
 
It's a whole 2000 bigger than your 1060, so it's gonna be quite the upgrade.
 
Turns out FF7 Remake on PC needs 11-12 gigs minimum to avoid losing performance to excessive texture eviction from VRAM. Seems a poor port in general, though.

Maybe I will upgrade my 3080 to the 3060 12 gig?
 
Even with 12GB it feels like shit compared to the PS5 version; don't waste your time. There are stutters even on 24GB cards.
 
I forgot to reply - check this link.

FF7R texture swapping

It seems the same as older SE FF ports: stutters from evicting textures to make room for new ones in VRAM. The guy found a way to disable it so they all stay cached, and that fixes it - until, of course, the VRAM fills up.

Lightning Returns was extreme with this issue: it was coded to run on machines with just 300MB of RAM, and it would stutter every time you moved into certain field areas as it unloaded old textures and loaded new ones.

I will speculate that with the PS5 version the devs had assurances everyone has 16 gigs of GDDR6, so they could do less aggressive texture swapping, while the PC version has probably been coded for low-end hardware, since some video cards on the market have pitiful amounts of video RAM. Even the original PS4 had more memory than low-end PC cards.

Next time you play it, check if it actually utilises all your VRAM and try his trick - I'm curious if it helps you. :)
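To make the eviction behaviour concrete, here's a toy LRU model (sizes and budgets are made up; this is not FF7R's actual streaming code): with too small a budget the cache thrashes and every area transition re-uploads textures, which is exactly the hitch pattern described above.

```python
from collections import OrderedDict

def upload_hitches(accesses, budget_mb, tex_mb=85):
    """Count texture uploads under an LRU cache with a fixed VRAM budget."""
    cache, used, misses = OrderedDict(), 0, 0
    for tex in accesses:
        if tex in cache:
            cache.move_to_end(tex)        # mark as recently used
            continue
        misses += 1                        # not resident: upload = potential hitch
        while used + tex_mb > budget_mb and cache:
            cache.popitem(last=False)      # evict the least-recently-used texture
            used -= tex_mb
        cache[tex] = True
        used += tex_mb
    return misses

# Walking between two areas whose combined textures don't fit in the budget:
walk = ["a1", "a2", "b1", "b2"] * 8
print(upload_hitches(walk, budget_mb=256))  # 32: thrashes, every access re-uploads
print(upload_hitches(walk, budget_mb=512))  # 4: all stay resident after warm-up
```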
 
Even the original PS4 had more memory than low-end PC cards.
The original PS4 had more VRAM than a freaking 780 Ti - you know, the flagship $700 Nvidia card for gamers in 2013.

I'm not seeing that. 8GB card. You're having another problem. Perhaps your settings are not optimal.
Are you playing the FF7 Remake right now?
I've seen plenty of people, including YouTubers, complaining about major stuttering issues even on 3090s.
 
My son has it. He had framerate issues to begin with, but once we turned on vsync and adjusted a few other settings it was solid. Granted, he's on 1080p. Maybe the game engine is choking on 4K?
 
The original PS4 had more VRAM than a freaking 780 Ti - you know, the flagship $700 Nvidia card for gamers in 2013.


The 780 Ti also aged terribly due to that 3GB.
 
OG Titans with 6GB haven't fared any better; Kepler's complete lack of compute didn't age well at all. The 7970 with 3GB started slapping both of them around years ago.
That's how Kepler, Maxwell, and (with its anemic hardware support) Pascal got such good power consumption - compute is a power hog.

Mantle/DX12/Vulkan were the nails in those coffins. :)
 
That's how Kepler, Maxwell, and (with its anemic hardware support) Pascal got such good power consumption - compute is a power hog.
Are you saying Maxwell and Pascal are power efficient because they sucked at performance?

The 780 Ti also aged terribly due to that 3GB.
It surely did age terribly - not even a year later, Shadow of Mordor recommended 6GB of VRAM for its HD texture pack. Imagine paying $700 and then being told you can't use high-resolution texture packs a year later, lol.
Nvidia wants to deliver just the bare minimum amount of VRAM as a cost-cutting measure, and it's sad to see people not see through this. Pascal was an anomaly I don't think Nvidia will repeat: an amazing performance-to-wattage ratio and plenty of VRAM. Why are people still keeping their 1060s? Because it's a decent-performing card whose 6GB frame buffer let it stand the test of time, and the 1080 Ti was so good you could even chug along with it until the middle of this new console generation.
 
my bad, I got confused since you mentioned Kepler too; those were not efficient - Maxwell almost halved the power consumption over it
yeah, sorry, I'm starting to celebrate my BD (12/22) a few hours early and my grammar/syntax will be more American than usual. :)
 
I'm not seeing that. 8GB card. You're having another problem. Perhaps your settings are not optimal.

On both my 3080 Ti and 2080 Ti it's bad at all 3 resolutions. My buddies with 3080s and 3090s are also mentioning stutters, but I haven't been able to swing by their places yet to check in person whether it's the same issue I'm seeing.

Don't get me wrong, it's playable, but frame times are still worse than on the PS5 version, which I also own. It's just a bad port in general... They've also ruined the HDR compared to the PlayStation versions. Apparently running the game in DX11 helps, but I haven't had a chance to try it.

This is pretty much what I'm seeing, if anyone else is wondering how bad the port is.

 
For gaming, the 3060 Ti for sure; for AI and ML, the more VRAM the better. So the 3060 12GB is for those who can't afford a higher-end card for scientific computing.
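A rough sketch of the VRAM arithmetic on the ML side (illustrative: assumes plain fp32 Adam training at ~16 bytes per parameter for weights + gradients + moment buffers, and ignores activations entirely):

```python
# fp32 Adam: 4 B weights + 4 B grads + 8 B moments = ~16 bytes per parameter.
# Illustrative arithmetic, not a benchmark.
def training_gb(params_millions, bytes_per_param=16):
    return params_millions * 1e6 * bytes_per_param / 1e9

for m in (100, 350, 700):
    gb = training_gb(m)
    print(f"{m}M params: ~{gb:4.1f} GB -> fits 12GB: {gb < 12}, fits 8GB: {gb < 8}")
```

By this arithmetic, a ~700M-parameter model squeaks into 12GB but not 8GB, which is the whole case for the 3060 12GB in that niche.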
 
So the 3060 12GB is for those who can't afford a higher-end card for scientific computing.
Not true at all.

The 780 Ti also aged terribly due to that 3GB.
There was a 6GB version that was fairly popular. And it aged a lot better.

This is pretty much what I'm seeing, if anyone else is wondering how bad the port is.
I think it's a beautiful port. Not everyone is having that glitch, and in every other way it is an amazing game. Let's not blow things out of proportion, folks.

That said, we're off-topic. Glitches with FF7 Remake should be taken to a new thread if the discussion needs to progress. I'd be happy to open such a thread if everyone would like.
 
That's a great idea actually.
Better to get the 3070 in that case, but if money isn't the issue, then you could buy a Pentium, sell it, and buy an i7 :D

That's a great idea actually.
Great - if you sell the 3060 for full price, you've got it... :D

I've been in the EVGA queue for both of these products for almost a year. The 3060 12GB is likely to come up first. I have a question about predicting which is more likely to have staying power ~4 years from now, since I tend to run stuff a really long time. The 3060 Ti's GPU is much faster, but the 12GB of RAM is possibly better in this respect if lower VRAM amounts end up being a limiting factor. What do you guys think?

The main reason I ask is that in 2013 I purchased a 770 2GB instead of the 4GB version because I was told the card wasn't really fast enough for it to matter. After a few years, the limited VRAM became an issue that prevented me from running some games. Similarly, the 780 Ti's 3GB severely limited that card just a couple of years after its launch, despite it otherwise being close to a 980 in performance.
2GB vs 4GB isn't the same as 8GB vs 12GB. Get the Ti; I have it and it handles 1440p Ultra and 4K Medium (Hitman 3 and Days Gone are what I play). I've compared Hitman 3 gameplay with a friend's 3060, and the 3060 is trash vs my Ti lol :)

For gaming, the 3060 Ti for sure; for AI and ML, the more VRAM the better. So the 3060 12GB is for those who can't afford a higher-end card for scientific computing.
For scientific computing you need a Tesla or whatever they offer now, lol...
 