
Is 24 GB of video RAM worth it for gaming nowadays?

Is 24 GB of video RAM worth it for gaming nowadays?

  • Yes

    Votes: 66 41.5%
  • No

    Votes: 93 58.5%

  • Total voters
    159
Do you know (you don't, hence my comment, hence yours) that the top GPU on Steam in 2022 was the 6 GB GTX 1060 from 2016, with many, many cards on the list having even less VRAM? The current consoles have 16 GB of system memory — not just GPU memory, but for the entire machine. And the most popular device in the world, the Switch, has 4 GB of system and video RAM combined. Four. So 256 GB just for VRAM? Wow, maybe in 2080.

No, actually I didn't know the top GPU was a 1060; that's probably mostly down to the price.
In case YOU didn't know, I 100% agreed that 24 GB is not necessary.

As for 256 GB being useful SOME day, come back in 20 years; you might be surprised.
 
That will NEVER be necessary for VRAM...

Who would ever need more than 640 K of memory?

Wild tangent aside, GPU VRAM amounts had stalled due to the old hardware generations. Games designed for the new consoles will drive system requirements up on PC as well.

Time to get used to the idea of retiring Windows 7 and that first-generation i7, getting more than 16 GB of RAM, and accepting that ray tracing is here to stay... ;)
 
A 24 GB card from today would be too hopelessly lacking in raw throughput to be worried about RAM.
This is how I tend to view it: generous VRAM is nice to have, for sure, and nobody would argue that extra isn't nice, but the necessity of it is closely linked to the compute ability and feature set of the GPU.

Take a Maxwell Titan, basically a 780 Ti with 12 GB of VRAM: would anyone expect that to be better than an RTX 3080, hell, even a 3070? Filling that VRAM with textures only helps if your GPU can push the frames. I'd take a 3070 hands down over a Maxwell Titan for gaming.
 
A PC does more things than gaming, and for those 24 GB is a necessity; but narrowing the focus to gaming only, I would say 16 GB is OK for now.
The problem is that developers hold back on memory usage, so we get graphics artifacts and texture pop-in that could have been avoided.
Developers are too lazy to put in sliders and give us the option to use more VRAM, so we all get the low usage — the lowest common denominator.
 
I voted Yes because VRAM needs have gone from megabytes to gigabytes since I started building PCs.
 
24 GB is a bit much. Even so, some games require a set amount depending on the resolution, and one game made fun of the RTX 3080 10 GB at launch because it required 11 GB.

I don't really see why the x080 cards don't get at least 16 GB. It's not like the memory costs an arm and a leg to put on; it's just that Nvidia wants you to buy their higher-tier cards. They don't want a repeat of the Pascal move, where the GTX 1080 Ti 11 GB beat the Titan X Pascal 12 GB, so the only reason to buy the Titan was if you needed the extra 1 GB of VRAM. They discontinued it and released the Titan Xp 12 GB, which was the fully unlocked GPU, if I remember correctly.
 
I think if 24 GB cards become mainstream and affordable, games can ship higher-res textures, because game companies most likely have high-res textures but downscale them for release.

And then you'd have games with over 100 GB installation sizes.

It doesn't hurt to have more memory, but for now, 24 GB is overkill for games.
Just look around in any game and you'll find twin assets everywhere. Walk 4 metres ahead and the tiles on the ground repeat, in every game.
 
it made fun of the RTX 3080 10 GB when the game launched because it required 11 GB
What game was that, and was the memory allocated but not used? I.e., perhaps the game called for 11 GB, but did a 3080 10 GB run it just fine, or was it a texture-streaming stutterfest? Quite curious, as I've yet to experience a stutterfest with a 3080.

My recollection is that RDNA2 had few strengths vs. Ampere, and one was VRAM on the higher-end models, so AMD-sponsored games made absolutely sure to fill that and go as light as possible on anything an RTX card would do well. After all, who could fault that approach? It was essentially the only play AMD had in their book at the time, so I wouldn't be surprised to find it was an AMD-sponsored title from 2020-2022.
 
Be aware: the more VRAM, RAM, and storage space there is, the more uncompressed, full-size images developers can use, giving better quality — and the CPU doesn't need to decompress the data before it gets moved to its destination.
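To put rough numbers on that, here's a back-of-the-envelope sketch (all rates assumed for illustration: 32 bits/pixel for uncompressed RGBA vs. roughly 8 bits/pixel for BC7 block compression; real block formats round tiny mip levels up to whole 4×4 blocks, which this ignores):

```python
# Rough VRAM footprint of a single 4096x4096 texture with a full mip chain.
# Assumed rates: 32 bpp uncompressed RGBA vs ~8 bpp BC7 (approximation only).
def texture_bytes(width, height, bits_per_pixel, mipmaps=True):
    total, w, h = 0, width, height
    while True:
        total += max(w, 1) * max(h, 1) * bits_per_pixel // 8
        if not mipmaps or (w <= 1 and h <= 1):
            break
        w, h = w // 2, h // 2  # each mip level halves both dimensions
    return total

uncompressed = texture_bytes(4096, 4096, 32)
compressed = texture_bytes(4096, 4096, 8)
print(uncompressed // 2**20, "MiB uncompressed vs", compressed // 2**20, "MiB compressed")
```

So a single uncompressed 4K texture costs roughly four times the VRAM of its block-compressed version, which is the trade-off the post above is pointing at: extra capacity lets developers spend memory on quality instead of compression.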
 
What game was that, and was the memory allocated but not used? I.e., perhaps the game called for 11 GB, but did a 3080 10 GB run it just fine, or was it a texture-streaming stutterfest? Quite curious, as I've yet to experience a stutterfest with a 3080.

My recollection is that RDNA2 had few strengths vs. Ampere, and one was VRAM on the higher-end models, so AMD-sponsored games made absolutely sure to fill that and go as light as possible on anything an RTX card would do well. After all, who could fault that approach? It was essentially the only play AMD had in their book at the time, so I wouldn't be surprised to find it was an AMD-sponsored title from 2020-2022.

Watch Dogs: Legion in 4K is one of them.
But there is another game whose name I cannot recall.
 
24GB is useful for some compute workloads, but is pretty overkill for gaming.

What some call overkill is what others think of as future-proof.
 
What some call overkill is what others think of as future-proof.
GPU future-proofing is not a good idea. PSU, maybe.
It's better to upgrade more frequently than to buy this much extra VRAM; it will cost less, IMO.
By the time your additional VRAM becomes useful, a much better GPU at a lower price per watt will have hit the market.
 
Watch Dogs: Legion in 4K is one of them.
It's been a while since I played it, and I doubt I ran 4K Ultra everything, but I would have used the highest texture setting for sure, and I don't remember having issues. The 11 GB figure seems more likely because the recommended card is an RTX 2080 Ti, which has 11 GB.
 
24 GB is about the right size to target ultra-quality 4K with ray tracing and the general requirements of applications on high-resolution monitors. If it wasn't needed, these GPUs wouldn't have it.
24 GB was marketed as creator territory, not gaming, even by Nvidia itself; plus, the rest of their Ampere stack makes do with half or even less. The story doesn't check out. Even Ada doesn't carry much more.

12-16 GB seems to be fine right now. At 8K... I'm sure you can cripple a card through VRAM somehow. Enjoy; you've just created a problem looking for expensive solutions.
 
Real-time colour holographic reconstruction (2020): point-cloud based, on a GTX Titan X, outputting 1920×1080 RGB + alpha, coloured, at 38.31 FPS.
Soon this will be for the masses, and yes, more and faster VRAM will be appreciated.
 
I understand 24 GB for work usage, but IMO it's totally overkill for pure gaming. I have no problems gaming with 12 GB at 4K.
 
Some games allocate a lot of memory/VRAM but don't use all of it.

This gives a false sense of "my game needs/uses a lot of VRAM".
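A toy model of that distinction (all numbers hypothetical): an engine can reserve a big VRAM pool up front, which monitoring overlays report as "in use", while the assets actually resident occupy far less:

```python
# Hypothetical engine-style VRAM pool: "reserved" is what overlays report as
# allocated; "used" is what loaded assets actually occupy.
class VramPool:
    def __init__(self, reserved_mib):
        self.reserved = reserved_mib  # grabbed up front at launch
        self.used = 0                 # assets actually resident

    def load_asset(self, size_mib):
        # Only exhausting the pool causes real trouble (streaming stutter).
        if self.used + size_mib > self.reserved:
            raise MemoryError("pool exhausted: real stutter territory")
        self.used += size_mib

pool = VramPool(reserved_mib=11 * 1024)    # game "requires 11 GB"
for asset_mib in (2048, 1536, 1024, 512):  # textures streamed in so far
    pool.load_asset(asset_mib)
print(f"reserved {pool.reserved} MiB, actually used {pool.used} MiB")
```

The gap between the two numbers is exactly why a reported "requirement" can exceed a card's capacity without the game actually stuttering on it.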
 
It will be useful and will have an impact once DirectStorage becomes the norm.
 
24 GB was marketed as creator territory, not gaming, even by Nvidia itself; plus, the rest of their Ampere stack makes do with half or even less. The story doesn't check out. Even Ada doesn't carry much more.

12-16 GB seems to be fine right now. At 8K... I'm sure you can cripple a card through VRAM somehow. Enjoy; you've just created a problem looking for expensive solutions.

I'm not sure. If you observe the current generation of GPUs (including AMD's), the trend has been to increase memory capacity to better accommodate modern technologies. High-end ray tracing and DirectStorage should both increase memory consumption; accordingly, Ada's memory capacity improvements over Ampere are massive in the higher performance segments, except for the flagship, which was already well supplied:

3070 Ti (8 GB) > 4070 Ti (12 GB) (50% increase)
3080 (10 GB) > 4080 (16 GB) (60% increase)
3090 (24 GB) > 4090 (24 GB) (no change)

You'll observe the same phenomenon on the red team:

RX 6800 XT (16 GB) > RX 7900 XT (20 GB) (25% increase)
RX 6900 XT (16 GB) > RX 7900 XTX (24 GB) (50% increase)

A creator-marketed GPU in 2023 would potentially have even more VRAM: the Titan Ada is rumored to be a 48 GB beast.
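For what it's worth, the generation-over-generation deltas in that list are easy to check (capacities as listed, percentage relative to the older card):

```python
# Percentage VRAM increase per tier, computed from the capacities above.
pairs = {
    "3070 Ti -> 4070 Ti": (8, 12),
    "3080 -> 4080": (10, 16),
    "3090 -> 4090": (24, 24),
    "RX 6800 XT -> RX 7900 XT": (16, 20),
    "RX 6900 XT -> RX 7900 XTX": (16, 24),
}
for tier, (old_gb, new_gb) in pairs.items():
    print(f"{tier}: {100 * (new_gb - old_gb) / old_gb:.0f}% increase")
```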
 
Maybe you should read my whole post. There is no "future proofing" if the GPU core's capabilities will be long superseded by the time games actually start utilizing 24 GB of VRAM.
That old notion doesn't apply to newer cards with ReBAR and DirectStorage capabilities now.
 
What some call overkill is what others think of as future-proof.
Tech "future-proofing" is what fanboys invoke when they don't have facts to justify their opinion.

If you truly want to future-proof tech, go buy it in the future.
 
Although the increase in CUDA cores is not that impressive, the opposite is true for the amount of L2 cache.

Every time I see the spec table, I say: if only there were a 4080 Ti with 13K CUDA cores at $1200, and the 4080 at $900.
 
I'm not sure. If you observe the current generation of GPUs (including AMD's), the trend has been to increase memory capacity to better accommodate modern technologies. High-end ray tracing and DirectStorage should both increase memory consumption; accordingly, Ada's memory capacity improvements over Ampere are massive in the higher performance segments, except for the flagship, which was already well supplied:

3070 Ti (8 GB) > 4070 Ti (12 GB) (50% increase)
3080 (10 GB) > 4080 (16 GB) (60% increase)
3090 (24 GB) > 4090 (24 GB) (no change)

You'll observe the same phenomenon on the red team:

RX 6800 XT (16 GB) > RX 7900 XT (20 GB) (25% increase)
RX 6900 XT (16 GB) > RX 7900 XTX (24 GB) (50% increase)

A creator-marketed GPU in 2023 would potentially have even more VRAM: the Titan Ada is rumored to be a 48 GB beast.
Yes, cue 2017: the 1080 Ti with 11 GB. Turing: 11 GB, but less if you went to an x80.

Basically, Nvidia had to catch up because they had devolved into absolute shit capacities. Other than that, 16 > 24 seems excessive within one generation, and we can see Nvidia still makes do with less until you go into their Titan territory. I think at 16 GB you're at a level right now where you're just fine for 4K. Maybe at some point you'll meet the odd title that wants more, but honestly, there's also a territory of just utterly crappy optimization/dev work. That's not the norm, and accommodating that extreme won't help you.

But maybe you're right; we're going to have to see what time brings us. Still, as long as consoles are stuck at a certain capacity, gaming isn't moving beyond it. 16 GB, though, I see as essential for any half-decent high-end video card. The fact the 4070 Ti already cuts down on that is telling, given its performance parity with a 7900 XT. I thought all the shiny new things needed VRAM :)
 