
For those who think 12 GB of VRAM can max out everything

It isn't enough for all games without RT and max VRAM settings; you pretty much sidestepped my points by saying that. Sure, the game will run, but that doesn't mean it's enough.
Well, for example, according to HUB's video nothing crosses 12 GB without a combination of maxed-out RT and DLSS frame gen, and that's on a 16 GB card.

 
Of course you can buy a lot of AMD cards with 16 GB, as well as the Arc A770; perhaps Intel can make a B770 with 16 GB or more any time now.
But if Cyberpunk is 5 years old and takes over 12 GB of VRAM, think about Star Wars Outlaws with the Outlaw preset. I can barely fit it in 16 GB of VRAM. I can benchmark it later on request.
I'm not here to debate street prices, because I take them the way they are at the present moment.
I'm here to tell people 12 GB of VRAM isn't enough right now.
Nah, even 16 GB is clutch; the 4090 is the only future-proof GPU and the only decent way of gaming on PC. If you can't max out everything, might as well buy a console, right?
 
Well, for example, according to HUB's video nothing crosses 12 GB without a combination of maxed-out RT and DLSS frame gen, and that's on a 16 GB card.

All he is doing there is measuring performance and VRAM usage. As I said, you didn't address my points. I also think using a 16 GB card is not a proper test; I don't know why he did it that way. Also, reviewers tend to use a bare-bones OS and don't have background things running like a browser or Discord. Discord does use VRAM in its default mode.
He has also done videos showing that you can't trust in-game settings: e.g. you set high/ultra textures in a game and it appears to run fine, but when you examine it, lower-quality textures are loading in the game.

Not sure why this bothers you so much, as some people do seem sensitive to the VRAM argument. I think you can buy an 8 or 12 GB card and use it, but I do have an issue with recommending expensive cards that have less than 16 GB, as it massively shortens the lifespan of a card and may cause issues in modern games. That's all, really.
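If you want to check what background apps like Discord actually hold in VRAM, you can query it yourself instead of trusting an overlay. A minimal sketch, assuming an NVIDIA GPU and the nvidia-ml-py package (pip install nvidia-ml-py); per-process numbers may be unavailable on Windows under WDDM:

```python
# List per-process VRAM usage as reported by NVML.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    try:
        name = pynvml.nvmlSystemGetProcessName(proc.pid)
        if isinstance(name, bytes):
            name = name.decode()
    except pynvml.NVMLError:
        name = "<unknown>"
    mem = proc.usedGpuMemory  # bytes, or None when the driver hides it
    mem_str = f"{mem / 2**20:.0f} MiB" if mem is not None else "n/a"
    print(f"pid {proc.pid:>6}  {mem_str:>9}  {name}")

pynvml.nvmlShutdown()
```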
 
One game, one scene, one set of settings chosen, on one card.

12 GB isn't enough bois, time to get over it.

Yeah orrite.
 
Well, for example, according to HUB's video nothing crosses 12 GB without a combination of maxed-out RT and DLSS frame gen, and that's on a 16 GB card.

Any talk about VRAM is often framed in terms of "future-proofing". 12 GB might be fine now, but it's too close to the limit to give peace of mind, assuming VRAM requirements keep growing in the next few years and you'll have issues with textures/assets loading. From my personal experience, there's only a handful of games where 8 GB is causing problems, and even fewer games where fixing those issues would result in a noticeable downgrade in texture quality. "My 2020 GPU can only handle early-2020-level textures in 2025, yuck."

Then there are games like The Last of Us, which was really unpolished at launch. They somehow managed to make better-looking textures that used less VRAM in later patches :eek:

But both AMD and Nvidia are looking at neural compression/decompression of textures to lower the impact on VRAM, and frame gen is also getting optimized to lower its memory footprint.
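For a sense of scale on why texture compression matters, here's back-of-envelope math for a single 4096x4096 texture. The BC7 rate (1 byte per texel) and the ~33% mip-chain overhead are standard figures; the additional neural compression ratio is purely an assumed placeholder, since those codecs aren't shipping yet:

```python
# Rough VRAM footprint of one 4K texture under different compression schemes.
TEXELS = 4096 * 4096
MIP_OVERHEAD = 4 / 3  # a full mip chain adds roughly 33%

rgba8 = TEXELS * 4 * MIP_OVERHEAD   # uncompressed, 4 bytes/texel
bc7 = TEXELS * 1 * MIP_OVERHEAD     # BC7 block compression, 1 byte/texel
neural = bc7 / 4                    # assumed extra ~4:1 from a neural codec

for label, size in [("RGBA8", rgba8), ("BC7", bc7), ("neural (assumed)", neural)]:
    print(f"{label:>17}: {size / 2**20:6.1f} MiB")
# RGBA8 ~85 MiB, BC7 ~21 MiB, neural ~5 MiB -- multiply by hundreds of textures.
```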
 

I agree with most opinions here; gamers in 2025 are absolutely insane (much like the companies making GPUs). There is no need to turn all sliders to the right; just enjoy your games for the fun aspect, not for the 4K shadow of every single leaf or the ray of light bouncing off something. No need to see every reflection in a puddle of water or on a shiny floor. Idk, in my daily life it doesn't rain much here, and shiny floors are rare; no one has that crap in their home.

If you're paying that much attention to the shadows or the floors, odds are the game is not that fun.
 
What a daft topic. "It's the year X, Y VRAM is no longer enough; one needs at least triple the VRAM to be future-proof."

Yeah, sure. The more the merrier. Thousands upon thousands of games from yesterday you can play just fine using a cheeky little 6700 XT / 3060 Ti, or even worse than that. Dozens of games from today you can play just fine on these as well, sometimes lowering the settings. Or without going that ham on settings if you have a 4070 or better. Not the point anyway.

We're in a state of the market where VRAM amounts grow much slower than compute power, and it makes people go apeshit: the similarly priced GTX 1070 and RTX 4070 Super are about three times apart in performance, yet the latter only has 1.5 times the VRAM.


Would increasing the 4070 Super's VRAM to 24 GB help? Sure. It'd make VRAM-hogging games and most prosumer workloads a lot more comfortable. But in the vast majority of games it would be the exact same experience.

Is 12 GB a bad VRAM amount today? Depends on the price, GPU performance, and what you wanna achieve. My 6700 XT would have to be at least 70% faster for me to reasonably max out its VRAM capacity. The 4070 Super is even faster than that.

//also Cyberpunk 2077 is perfectly playable on 12 GB GPUs. Played it on a homie's PC with a 3080 12 GB and it was flawless in all regions (3440x1440, DLSS Balanced, RT Reflections enabled) except for CPU-intensive ones (a 9600K, even at 5.2 GHz, isn't really a strong CPU by today's standards).
 
//also Cyberpunk 2077 is perfectly playable on 12 GB GPUs. Played it on a homie's PC with a 3080 12 GB and it was flawless in all regions (3440x1440, DLSS Balanced, RT Reflections enabled) except for CPU-intensive ones (a 9600K, even at 5.2 GHz, isn't really a strong CPU by today's standards).

CP77 works fine at 1440p with 8 GB; the problem is the rays and crap. I do get that the game turned into a visual safari more than a proper gaming experience thanks to Nvidia, but you don't need that crap to play the game.

I guess it wouldn't work at 4K, but you shouldn't be trying that on an 8 GB card anyway. So I see no issue with that game.
 
I guess it wouldn't work at 4K, but you shouldn't be trying that on an 8 GB card anyway
3070 Ti + SSR High instead of Ultra + DLSS Quality + zero RT + ReBAR off = flawless 4K60 gaming experience. Played 7 hours non-stop, also visiting D-town. Enabling RT at 4K made it unplayable, of course.
 
All graphics cards with less than 48 GB of VRAM are useless trash because I want to run Stable Diffusion at a reasonable resolution.
But seriously, my 10 GB 3080 never got close to filling all of its memory in a game, and that's at 3840x2160.
The lesson here is don't assume your personal needs or views are universal. They are not.
 
The lesson here is don't assume your personal needs or views are universal. They are not.
It's not personal views; it's independent benchmarks, independent from my views. If you want to max everything out, 12 GB ain't enough, and that's a fact. Benchmarks aren't views; they are facts, scientific measurements if you want.
The implication here is that, for instance, the RTX 4070/Super, RTX 5070, RX 7700 XT, B570 and B580 aren't great at present, not to mention long term (at least 2-3 years).
They're simply budget cards at big prices. Check prices for 8 Gb GDDR6 here, https://www.dramexchange.com/ , and see how much it would cost to add 8 GB of GDDR6.
Draw your own conclusions.
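For illustration, the arithmetic is trivial; the spot price below is an assumed placeholder, so plug in the current DRAMeXchange quote yourself:

```python
# Back-of-envelope chip cost for 8 GB of extra GDDR6.
PRICE_PER_CHIP = 2.50  # USD per 8 Gb (= 1 GB) GDDR6 chip -- assumed, check dramexchange.com
EXTRA_GB = 8

print(f"~${EXTRA_GB * PRICE_PER_CHIP:.2f} in memory chips for +{EXTRA_GB} GB")
# Caveat: bus width and chip density constrain real boards, so "just add 8 GB"
# can mean denser chips or a board redesign, not merely soldering on more packages.
```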
 
Is that allocated VRAM or actual, in-use VRAM? ;) There's a big difference.
I think the point here is the 16 FPS 1% lows.
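Still, the allocated-vs-used distinction matters: most overlays report either what the game has allocated or the device-level "used" figure, neither of which is the working set the game actually touches each frame. A quick device-level check, again a sketch assuming an NVIDIA card and nvidia-ml-py:

```python
# Print device-level VRAM figures via NVML.
# "used" = memory reserved on the device, not the per-frame working set.
import pynvml

pynvml.nvmlInit()
info = pynvml.nvmlDeviceGetMemoryInfo(pynvml.nvmlDeviceGetHandleByIndex(0))
print(f"total {info.total / 2**30:.1f} GiB | "
      f"used {info.used / 2**30:.1f} GiB | "
      f"free {info.free / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```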
 
Last summer and into autumn I played and finished Cyberpunk 2077 and the expansion two times: once with a male V and once with a female V. It took me about 250 hours combined. I have an RTX 4070 and used path tracing, DLSS on Performance, all settings maxed out, and frame gen. Not once did I say to myself that 12 GB was not enough and that I should stop playing one of the best games I have ever played because of some frametime graph.
 
It's not personal views; it's independent benchmarks, independent from my views. If you want to max everything out, 12 GB ain't enough, and that's a fact. Benchmarks aren't views; they are facts, scientific measurements if you want.
The implication here is that, for instance, the RTX 4070/Super, RTX 5070, RX 7700 XT, B570 and B580 aren't great at present, not to mention long term (at least 2-3 years).
They're simply budget cards at big prices. Check prices for 8 Gb GDDR6 here, https://www.dramexchange.com/ , and see how much it would cost to add 8 GB of GDDR6.
Draw your own conclusions.


None of those cards can max everything out even if they had 100 GB of VRAM; not sure what point you want to make. Want to max everything out? Get a 5090, and even then you have to play at 1080p in some games to get a decent frame rate.

Some clickbait benchmarks use idiotic settings just to sell views, or, who knows, to sell cards in exchange for some sponsor dollars.
 
I am currently playing a game that was released in October 2024, at 1440p with everything on Ultra and DLSS at Quality, and I typically use up to 9.8 GB, but with RT off, simply because I tried it and in this game (Veilguard) there was little to no improvement. If I had kept RT on, I may well have run out of VRAM, so I can see both sides of the story. However, we don't all like and play the same games, so for the individual, as always, one size does not fit all. Should Mass Effect 4 (5?) actually get released in a couple of years and I find myself needing more VRAM to play it well, then at that point I would upgrade whatever card I owned at the time.
 
I would rather have a GPU strong enough for its specs than a weak GPU with a bunch of RAM glued to it as if that were actually going to help.
 
PCGamesHardware had a German article and a web video a few days ago:

8 GiB vs 16 GiB on graphics cards (recent, in German)

That topic has long been obvious and clear. I realised it before the Nvidia 3070.

For me these are very old facts. Some will not understand it and will argue.

Gamers Nexus on YouTube had videos about that VRAM topic months ago.

Regardless of whether it's German or not, I think those graphs can be read by anyone.
 
It can't.

Cyberpunk 2077, motorcycle ride in Dogtown around the obelisk/Heavy Hearts club
4060 Ti 16 GB VRAM, 5700X3D, 2K resolution
Capture software: CapFrameX + RTSS
I just can't use 12 GB VRAM cards anymore. Time to get over it.
Many of us knew this was coming. Just a matter of time. That is the nature of tech and gaming.
As always, the choices are simple:
1. Buy a card with more VRAM.
2. Run the game in question at lower resolution/settings.

There it is.
 
Lol, CP2077 probably manhandles that 4060 Ti badly.

With no upscaling it will destroy mine; I play at 3840x2160.
 
I would rather have a GPU strong enough for its specs than a weak GPU with a bunch of RAM glued to it as if that were actually going to help.
100% this. I'd rather have a 3080 than a Maxwell Titan X, or a 4070 than a Radeon VII (although as a collector... nice). You get the point: appropriate VRAM is what you need. Is more better? Sure! Can you purposely saturate some cards in edge cases? Why not!

It's pretty simple: if you think a card doesn't have enough VRAM for you, don't buy it. That doesn't mean everyone who bought the card you didn't is dumber than you.
 
People tend to think that some things are mutually exclusive, while the truth is they're not. All of these are true at the same time:
Nvidia sells VRAM as a higher-tier feature, BUT that doesn't mean 12 GB isn't enough for 1440p max. I jumped from a 3080 10 GB to a 4070 Super because it was either that or installing AC in the house; I couldn't take the heat of a 350 W card. I saw the 4070S use 11 GB in scenarios where the 3080 used 9 GB, and both ran smoothly. The 4070 Ti Super has a shader-count advantage over the 4070S, and that's the main difference you see between them in tests. A 4080S with 12 GB would be faster than a 4070 Ti Super with 16 GB.
 
It's pretty simple: if you think a card doesn't have enough VRAM for you, don't buy it. That doesn't mean everyone who bought the card you didn't is dumber than you.
Honestly, a kid playing something he likes on medium-high settings on his "shit" 4060 with "dogshit" 8 GB is probably happier than an autistic "enthusiast" who is glued to a frametime graph 24/7 and busy calculating how every possible card except a $9,999 halo-tier flagship provides an unplayable experience because it cannot max out every setting.
 
PCGamesHardware had a German article and a web video a few days ago:

8 GiB vs 16 GiB on graphics cards (recent, in German)

That topic has long been obvious and clear. I realised it before the Nvidia 3070.

For me these are very old facts. Some will not understand it and will argue.

Gamers Nexus on YouTube had videos about that VRAM topic months ago.

Regardless of whether it's German or not, I think those graphs can be read by anyone.
I'm surprised even a 7600-class GPU shows a difference between 8 GB and 16 GB, granted not that much, and it depends on the game.

This topic has been covered before; someone modded a 3070 to 16 GB of VRAM and got improved performance as well.

 
Honestly, a kid playing something he likes on medium-high settings on his "shit" 4060 with "dogshit" 8 GB is probably happier than an autistic "enthusiast" who is glued to a frametime graph 24/7 and busy calculating how every possible card except a $9,999 halo-tier flagship provides an unplayable experience because it cannot max out every setting.
This! If you're hitting the VRAM wall, turn some settings down or off and enjoy the game.
 