
Is 24 Gb of video RAM worth it for gaming nowadays?

  • Yes

    Votes: 66 41.5%
  • No

    Votes: 93 58.5%

  • Total voters
    159
This is just War Thunder...

Using the 3 HD-texture DLCs, the Movie graphics preset, 2160p...

VRAM usage (logged roughly as in the sketch below):

warplane battles = 13 GB
ground forces = 16 GB
warship battles = 18 GB

Please, can anyone actually play this old free-to-play MMO on an RTX 4070 Ti 12 GB or RTX 4080 16 GB for a 2-3 hour session, with the same graphics settings and resolution?
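For anyone who wants to reproduce numbers like these, here's a minimal logging sketch. It assumes an NVIDIA card and the nvidia-ml-py (pynvml) package, and it reports board-wide VRAM allocation rather than a per-game figure, so background apps count too.

```python
# Minimal VRAM logger sketch - assumes an NVIDIA GPU and the
# nvidia-ml-py package (imported as pynvml). Run it in the background
# while playing and note the peak value it reports.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

peak = 0.0
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values in bytes
        used_gb = mem.used / 1024**3
        peak = max(peak, used_gb)
        print(f"VRAM used: {used_gb:5.1f} GB (peak {peak:5.1f} GB of {mem.total / 1024**3:.0f} GB)")
        time.sleep(5)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```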
Oh that would be mighty interesting.
 
I think people are overcomplicating this. If a GPU costs $400-600 it should have at least 12 GB of VRAM, if it costs $700-1200 it should come with at least 16 GB, and anything more expensive than that should come with 24 GB in 2023.

We as gamers should be voting with our wallets and expecting this as a minimum.

I'm not going to sit here and say what people should have purchased, but anyone who decided to grab a 6700 XT/6800 over a 3070/3070 Ti will likely have a much better gaming experience in general while spending around the same. Sure, the Nvidia options have superior RT and I much prefer DLSS, but does that matter if they don't have enough VRAM to actually run the latest games with ultra textures? I know what the Nvidia fanboys will say: just lower settings. To me that's ridiculous considering both GPUs cost well over 500 USD just one generation ago. It just feels bad.

Although I guess you could say Nvidia loyalists, me included, deserve this; they've been gimping VRAM since at least the 600 series, probably longer.

Considering how much Nvidia seems to want to push the market forward with new/improved technologies, it confuses me why they are so stingy with VRAM, other than to make people upgrade sooner than they otherwise would if the GPUs had adequate VRAM.

If the 3070 Ti came with 16 GB and the 3080 had 20 GB, both of those GPUs would have been viable for much longer, and even in edge cases where a game isn't well optimized they could just brute-force through it. Even if Nvidia wanted more money for such models, they could have at least given AIBs the option to sell slightly more expensive variants.
Years ago, my 770 2 GB went prematurely obsolete only due to its lack of VRAM. The same goes for the whole 700-series lineup, which was especially sad with the 780 Ti 3 GB. I won't make that mistake again.
 
Before the recent onslaught of VRAM-hungry games, I'd have said 24 GB is overkill.

Now, with Hogwarts Legacy, Forspoken, and most recently The Last of Us PC port, 24 GB is definitely recommended if you're already dropping over 1K USD on a GPU.

16 GB should be the minimum you shoot for if you want a card to last 3+ years. Now that developers are unshackled from the last-gen consoles (PS4/Xbox One), they will go balls to the wall with VRAM allocation when porting to PC.
 
Considering how much Nvidia seems to want to push the market forward with new/improved technologies, it confuses me why they are so stingy with VRAM, other than to make people upgrade sooner than they otherwise would if the GPUs had adequate VRAM.
Because new, improved technologies make old GPUs obsolete, and large VRAM doesn't?

In their marketing, Nvidia always compares against their own previous generation. They are the market leader; they aren't trying to steal users from AMD, that's a small pool. The bigger pool is getting existing customers to upgrade. It's what Apple does.
 
I have more fun watching laptop gamers spending $3K on devices that are instantly rendered EOL. Those people never cease to amaze me either way.
 
Oh that would be mighty interesting.

Yup, today some PC gamers tend to focus only on Hogwarts Legacy, Forspoken, RE4 Remake, TLOU Part 1... but forget that War Thunder is also very hungry for VRAM.

So yes, 24 GB of VRAM is not overkill for PC games today or tomorrow, especially for gamers playing at 2160p.
 
There are a few games that I can't play at 8K due to maxing out the 24 GB of VRAM on my 4090.
 
That's rough alright.

Indeed. The 4090 was marketed as an 8K gaming GPU, and while everyone of course knows that's BS for the most part, it can play most games at 8K with DLSS. However, games are starting to appear where the 24 GB of VRAM just isn't enough, even when using DLSS Ultra Performance, like here in the Witcher 3 remaster. The frametime graph (and fps) looks less than optimal due to the heavy VRAM swapping.

[screenshot: Witcher 3 remaster at 8K, frametime graph]


As a note, the fps would have been fine had the VRAM swapping not been going on.
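A rough way to catch this while it happens is to watch how close the card is to its limit. Below is a sketch assuming an NVIDIA GPU and the nvidia-ml-py (pynvml) package, with an arbitrary 95% warning threshold (not an official figure); once dedicated VRAM is full, further allocations land in system RAM and get shuffled over PCIe, which is what shows up as frametime spikes.

```python
# Rough "am I about to swap?" check - assumes an NVIDIA GPU and
# nvidia-ml-py (pynvml). Stop with Ctrl+C.
import time
import pynvml

THRESHOLD = 0.95  # arbitrary fraction considered "nearly full"

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

while True:
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values in bytes
    frac = mem.used / mem.total
    status = "likely spilling into system RAM" if frac > THRESHOLD else "ok"
    print(f"{mem.used / 1024**3:5.1f} / {mem.total / 1024**3:.0f} GB ({frac:.0%}) - {status}")
    time.sleep(2)
```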
 
Sorry, no. Better get something with 64 GB of VRAM.

You will need it for the upcoming port of The Last of Us Part 2 to make it playable beyond potato settings.
 
Nope, gimme a 1 TB GPU or gimme death, like, yesterday... hehehe :)

This reminds me, I used to think a 500 Hz monitor would change my life, but it turns out I actually don't like anything above 165 Hz, because it starts to develop a soap opera effect at around 240 Hz / 240 fps. lol
 
Gaming-wise, 24 gigs is too much for the relatively slow GPUs we have nowadays.
3D-rendering-wise, 24 gigs is too little for the heavy textures and models we render nowadays.

So I consider 20+ GB gaming cards the worst of both worlds.

For gaming, you'd rather have a faster GPU (one that outperforms the 4090) capped at 16 gigs of VRAM than the 24-gig, power-hungry monsters with relatively sluggish cores that are the RX 7900 XTX and RTX 4090.

Just go and see how the 10-gig 3080 makes fun of the 11-gig 2080 Ti.
Just go and see how the 12-gig RTX 3060 struggles to compete with the 8-gig 6600 XT.

More VRAM means more flexibility and less heat, since a lower percentage of the VRAM is in use, giving the GPU more room to boost; alas, current tech only gives us GPUs too slow to use the full potential of 24 gigs.
 
Gaming-wise, 24 gigs is too much for the relatively slow GPUs we have nowadays.

For gaming, you'd rather have a faster GPU (one that outperforms the 4090) capped at 16 gigs of VRAM than the 24-gig, power-hungry monsters with relatively sluggish cores that are the RX 7900 XTX and RTX 4090.

No it ain't, and no I wouldn't.
 
@Dragam1337
The system RAM and VRAM usage for Witcher at 8K is... amazing. Would 2x MSAA at 4K require less VRAM and GPU load?
 
Indeed. The 4090 was marketed as an 8K gaming GPU, and while everyone of course knows that's BS for the most part, it can play most games at 8K with DLSS. However, games are starting to appear where the 24 GB of VRAM just isn't enough, even when using DLSS Ultra Performance, like here in the Witcher 3 remaster. The frametime graph (and fps) looks less than optimal due to the heavy VRAM swapping.

[screenshot: Witcher 3 remaster at 8K, frametime graph]


As a note, the fps would have been fine had the VRAM swapping not been going on.
I'm guessing you didn't get the sarcasm.

To put it bluntly, your problem is kind of unique to you. 99% of people have PC setups that cost less than your 8K screen alone. I suppose you should sell that 4090 as soon as possible and get something better... oh wait... there isn't anything better... then, I know it hurts, but maybe you could try playing at mortal human settings?
 
The newest video card I can find with 24 Gb (3072 MB) of VRAM is the RX 5300 3 GB...

Gb must mean Nvidia gigabytes: like drive makers' kilobytes, they shrink after purchase (GTX 970).
 
I'm guessing you didn't get the sarcasm.

To put it bluntly, your problem is kind of unique to you. 99% of people have PC setups that cost less than your 8K screen alone. I suppose you should sell that 4090 as soon as possible and get something better... oh wait... there isn't anything better... then, I know it hurts, but maybe you could try playing at mortal human settings?

Janteloven is strong with you.

@Dragam1337
The system RAM and VRAM usage for Witcher at 8K is... amazing. Would 2x MSAA at 4K require less VRAM and GPU load?

Can't use MSAA in the Witcher 3 remaster, but if you could, it would use substantially fewer resources, yes :)
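As a back-of-the-envelope illustration of why, counting only the colour render target and assuming 4 bytes per pixel (real engines keep several more targets, so treat this as illustrative rather than a full VRAM estimate):

```python
# Back-of-the-envelope render-target sizes: colour buffer only,
# 4 bytes per pixel assumed. Not a full VRAM estimate.
def target_bytes(width, height, samples=1, bytes_per_pixel=4):
    return width * height * samples * bytes_per_pixel

eight_k   = target_bytes(7680, 4320)             # 8K, no MSAA
four_k_2x = target_bytes(3840, 2160, samples=2)  # 4K with 2x MSAA

print(f"8K, no MSAA : {eight_k / 2**20:6.1f} MiB")    # ~126.6 MiB
print(f"4K, 2x MSAA : {four_k_2x / 2**20:6.1f} MiB")  # ~63.3 MiB
```

So the 4K + 2x MSAA colour target is roughly half the size of the plain 8K one.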

Most games are right at the upper limit of VRAM when using 8K + DLSS:

[screenshots: VRAM usage in several games at 8K + DLSS]

But some are more modest in their VRAM requirements:

[screenshots: games with more modest VRAM usage at 8K]
 
Janteloven is strong with you.
You're still not getting it. I don't envy your 4090 and 8K screen, nor do I think you shouldn't have one. What I think is that you should lower your expectations to a reasonable level and appreciate what you have instead of complaining about a non-issue. When your new Lamborghini does 0-60 in 2.9 seconds instead of 2.8, it's not the end of the world, you know, and I won't apologise for not feeling sorry for you.

Seriously, though. The Steam survey doesn't even list 8K as a primary screen resolution, and even 4K has an adoption rate of 1.78%. How many people do you think share or understand your "problem"?
 
@Dragam1337
That Spiderman screencap at 8K is amazing. Does the Spiderman title take more system resources than Cyberpunk 2077? I guess at 8K, who needs AA?
 
Nowadays, 32 GB is the minimum, and that's what I have at the moment; however, my next system will have at least 64 GB. In many cases I currently run out of RAM.
 
@Dragam1337
That Spiderman screencap at 8K is amazing. Does the Spiderman title take more system resources than Cyberpunk 2077? I guess at 8K, who needs AA?

I'm not entirely sure which metric you mean by system resources: if you mean RAM, then yes, but if you mean GPU load, then no :)

[screenshot: resource usage at 8K]


Btw, AA is still very much needed at 8K; I reckon it will always be needed. You need some kind of temporal stabilizer, like TAA or DLSS.
 
As of now, I feel that 8 GB is more or less the minimum for modern AAA games. However, games like Hogwarts Legacy and TLOU Part 1 require 8 GB or more to fulfill the "Recommended" spec.
 
No it ain't, and no I wouldn't.
Excellent delivery, and what a thorough proof!

Just check whether any game consumes 20+ GB whilst staying playable (1% lows strictly above 30 fps and an average strictly above 45 fps) on a 4090.
...there's no such game.
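For anyone who wants to run that check on their own captures, here's a small sketch of the criterion above (average strictly above 45 fps and 1% low strictly above 30 fps), using the common simplification of "1% low" as the average of the slowest 1% of frames. The frametime list is made up; in practice you'd load it from a capture tool's CSV export.

```python
# Playability check per the criterion above: avg fps > 45 and 1% low > 30.
# "1% low" is simplified here to the mean of the slowest 1% of frames.
def fps_stats(frametimes_ms):
    fps = [1000.0 / ft for ft in frametimes_ms]
    avg = sum(fps) / len(fps)
    slowest = sorted(fps)[: max(1, len(fps) // 100)]  # worst 1% of frames
    low_1pct = sum(slowest) / len(slowest)
    return avg, low_1pct

# Made-up example: mostly 60 fps with occasional 40 ms stutters.
frametimes = [16.7] * 980 + [40.0] * 20
avg, low = fps_stats(frametimes)
verdict = "playable" if avg > 45 and low > 30 else "not playable"
print(f"avg {avg:.1f} fps, 1% low {low:.1f} fps - {verdict}")
```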
 
The new changes to DirectX have some news sites talking about gaming PCs following the current consoles, moving more and more into VRAM and using less system RAM.
Low-end systems won't be affected at all, since they use system RAM to top off their VRAM anyway, but the newer approach takes some strain off the CPU.


The argument with no answer was comparing RAM and VRAM on bandwidth and latency and the balance between them: the pros and cons of a system with only one memory type (like the consoles), and how a PC with almost no RAM is theoretically possible if it could boot and use VRAM the way consoles do. We already have iGPUs with no VRAM that use system RAM, so we have real-world examples of both approaches.

There's a never-ending push to move more and more processing over to the GPU, and now into VRAM, and it's curious to think about where it will stop.
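If you want to eyeball that balance on your own machine, here's a one-shot snapshot sketch of the two pools, assuming an NVIDIA GPU plus the nvidia-ml-py (pynvml) and psutil packages; other GPU vendors would need a different query path.

```python
# Snapshot of the two memory pools under discussion: dedicated VRAM
# vs system RAM. Assumes nvidia-ml-py (pynvml) and psutil are installed.
import psutil
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
vram = pynvml.nvmlDeviceGetMemoryInfo(gpu)  # bytes
ram = psutil.virtual_memory()               # bytes

print(f"VRAM: {vram.used / 2**30:5.1f} / {vram.total / 2**30:5.1f} GiB used")
print(f"RAM : {ram.used / 2**30:5.1f} / {ram.total / 2**30:5.1f} GiB used")

pynvml.nvmlShutdown()
```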
 