
Is 24 GB of video RAM worth it for gaming nowadays?


  • Yes

    Votes: 66 41.5%
  • No

    Votes: 93 58.5%

  • Total voters
    159
See,
that,
form of,
on,
case,
that's
frame times

Hopefully that clears up my proofreading skills, since you seem to be calling them into question :).

First skim read is free, I'll bill you next time.

Thank God the autocorrect works otherwise you'd be dead in the water.

Still couldn't point to what I asked for though, hmm, curios.
 
I think that when Plague Tale at 4K ultra uses around 4.5 to 5.5 GB of VRAM, the whole VRAM discussion is pointless. For comparison, TLOU requires 9.5 GB at 720p, and it looks much worse than Plague Tale. So the only thing having more VRAM will achieve is games hogging more and more because devs don't optimize crap. You won't get better visuals; you'll just pay more for a card with more VRAM that also uses more power. Great, isn't it?
Another thing to contend with is how AA/AF and other 'driver and/or post-process' effects factor in. Even better, it's not as simple as 'moar VRAM = less 'costly' AA'.

Beyond AA, etc., some games aren't even reliable in how much VRAM they use.
Example: UE4-based MechWarrior 5 would not come close to eating up 8GB VRAM on my RX580s, managed to (mostly) play smoother on my 4GB 6500XT, *AND* MW5 now regularly uses more than 8-10GB on my Vega (10) 16GB HBM2 card.
(I'm having flashbacks to the oldmeme: "YouCan'tExplainThat!")
 
Thank God the autocorrect works otherwise you'd be dead in the water.

Still couldn't point to what I asked for though, hmm, curios.
Curious.
 
See,
that,
form of,
on,
case,
that's
frame times

Hopefully that clears up my proofreading skills, since you seem to be calling them into question :).

First skim read is free, I'll bill you next time.


Maybe, but it's not the reason why you'd buy a 4090/XTX. You do that for the power of those cards, not the size of their frame buffers. There are several exceptions to this, such as if your work involves 3D editing or video production.

4K textures will remain at similar sizes, and unless future games start offering higher resolution textures for 4-8K displays, I don't really see this changing. What could change is developers getting lazier and releasing more unoptimized games.

Well, I like playing at 8K, and a 4080 could handle a good deal of games at 8K with DLSS in terms of processing power... but it wouldn't have enough VRAM for the vast majority of them :)
 
Another thing to contend with is how AA/AF and other 'driver and/or post-process' effects factor in. Even better, it's not as simple as 'moar VRAM = less 'costly' AA'.

Beyond AA, etc., some games aren't even reliable in how much VRAM they use.
Example: UE4-based MechWarrior 5 would not come close to eating up 8GB VRAM on my RX580s, managed to (mostly) play smoother on my 4GB 6500XT, *AND* MW5 now regularly uses more than 8-10GB on my Vega (10) 16GB HBM2 card.
(I'm having flashbacks to the oldmeme: "YouCan'tExplainThat!")
Yep, it's more of an implementation thing and a resolution issue than a simple "newer games need more VRAM".
 
Well, I like playing at 8K, and a 4080 could handle a good deal of games at 8K with DLSS in terms of processing power... but it wouldn't have enough VRAM for the vast majority of them :)
Sure, but the number of people who can afford 8K (which is 4x 4K, not two 4K monitors, to be clear) is small, and for those who choose to go that route, I don't think they'd be budget-limited to the 4080 over the 4090. And again, this is a resolution question, not a "worth it for gaming" question.
 
He never gives up, even when exposed to the whole forum. I envy his commitment. Seriously, bravo
He can't, because then his internal narrative collapses.
 
You don't see them at the exact time they need to be on screen, whatever that even means; you see them at the time they are received by the monitor, like in the uncapped case, and you also get tearing like in the uncapped case. Unless you have a form of synchronization on, you always get this behavior no matter how many frames per second the game runs at, because the synchronization rate does not match the cadence of frames being delivered.
You're right about Enhanced Sync and Fast sync, that's why I said combined with an FPS limit! When you have Fast Sync enabled, your GPU calculates all the frames and discards the ones above your monitor's refresh rate to eliminate tearing. An FPS limit will force the GPU not even to calculate those frames. So when you set your FPS limit to the monitor's refresh rate, and the game can actually achieve that, you will see butter smooth, even frame times and no tearing with no unnecessary heat introduced to your PC. I don't know what more you could want.
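
Rough sketch of what an FPS limiter boils down to (a hypothetical Python loop of my own, not any particular driver's implementation; "render_frame" and "REFRESH_HZ" are made-up names): the loop simply never starts more frames than the refresh rate allows, instead of rendering extras the way Fast Sync does and then discarding them.

Code:
import time

REFRESH_HZ = 144                  # assumed monitor refresh rate
FRAME_BUDGET = 1.0 / REFRESH_HZ   # ~6.9 ms per frame at 144 Hz

def run_capped(render_frame):
    # render_frame stands in for the game's per-frame work
    while True:
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            # idle out the rest of the frame budget instead of
            # spending GPU time on frames above the cap
            time.sleep(FRAME_BUDGET - elapsed)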

Next:
In this situation, not only do you get input lag comparable to the vsync case, but you also get tearing, so you stand to gain nothing. The only reason you might want to do this is if the game you're playing can't quite hit 60 and you don't want stutter. As a matter of fact, some games, especially on consoles, are doing this: whenever the framerate drops below 60, they switch vsync off. That's why I said this hardly achieves anything in my initial comment; hopefully now you get it.
Vsync calculates every frame and synchronizes them to your screen refresh rate by queuing (or "waiting in line" if you speak American), which is what gives you bad input lag.
A frame rate cap doesn't even render the frames that are above your screen's refresh rate. No queuing = no added input lag. The only input lag you might feel like you're "introducing" is the torn frames in the uncapped situation that you lose, which is next to nothing.

How are these two comparable to you?

If you still think a frame rate cap is comparable to Vsync, honestly, try them out. One after the other. There's nothing more I can say at this point.
 
He can't, because then his internal narrative collapses.
Hard to give up when you are right.

And people make it harder when I ask for proof of something I supposedly said and they can't provide it, because there is none.

Now that's truly laughable.
 
Ignorance truly is bliss.

I mean, if you don't acknowledge facts, you can certainly pretend you're right! What a revolutionary doctrine!

When 8K gaming is normal, maybe 24 GB of VRAM will be needed; good luck with a years-old GPU at that point, though.

Until then, halo products will remain relevant to creators for their hardware resources, and to gamers who like pushing resolution/FPS.
 
I'm kinda shocked this thread hasn't been locked......
 
Another thing to contend with is how AA/AF and other 'driver and/or post-process' effects factor in. Even better, it's not as simple as 'moar VRAM = less 'costly' AA'.

Beyond AA, etc., some games aren't even reliable in how much VRAM they use.
Example: UE4-based MechWarrior 5 would not come close to eating up 8GB VRAM on my RX580s, managed to (mostly) play smoother on my 4GB 6500XT, *AND* MW5 now regularly uses more than 8-10GB on my Vega (10) 16GB HBM2 card.
(I'm having flashbacks to the oldmeme: "YouCan'tExplainThat!")
I suppose games use different amounts on different cards, just like Windows reserves different amounts of system RAM on PCs with different amounts available.

I mean, I currently have 6.3 GB occupied with 6 Chrome tabs, Task Manager and GPU-Z open, Steam and GOG running in the background. I wouldn't see this much on my HTPC with 16 GB RAM in it.
 
When 8K gaming is normal, maybe 24 GB of VRAM will be needed; good luck with a years-old GPU at that point, though.

I don't think 8K gaming ever will be normal. I mean look at all the years that 4K has been out and according to the Steam Hardware Survey the adoption rate is still below 2%.

It's not the cost of the 4K monitors that's holding it back. It's the cost of the GPU to run it.

Then you look at 8K and that's as many pixels as four 4K monitors to process.
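
Quick back-of-the-envelope check on the "four 4K monitors" figure (plain Python arithmetic, nothing more):

Code:
pixels_1440p = 2560 * 1440   #  3,686,400 pixels per frame
pixels_4k    = 3840 * 2160   #  8,294,400 pixels per frame
pixels_8k    = 7680 * 4320   # 33,177,600 pixels per frame

print(pixels_8k / pixels_4k)     # 4.0 -> 8K pushes exactly four times the pixels of 4K
print(pixels_8k / pixels_1440p)  # 9.0 -> and nine times the pixels of 1440p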
 
I don't think 8K gaming ever will be normal. I mean look at all the years that 4K has been out and according to the Steam Hardware Survey the adoption rate is still below 2%.

It's not the cost of the 4K monitors that's holding it back. It's the cost of the GPU to run it.

Then you look at 8K and that's as many pixels as four 4K monitors to process.
Yep, it's because the pixel count quadruples every time you double the resolution.

It will definitely be normal at some point; progress marches on.

4K was mainly released for creators and professionals, for gaming it never really took off because it has high costs associated with it, plus developers don't always release games that can really take advantage of higher pixel counts, leaving modders or community projects to create higher resolution texture packs or things like ultrawide support. Additionally, current display cable limitations make you pick between 1440p/240/360 Hz or 4K/120 Hz, which isn't a compromise many want to make. IMO OLED 1440p/240 Hz is the current top tier, and the panels in development that are OLED 4K/240 Hz will be the next top tier.

Unfortunately game development is, like many things, held back by the lowest common denominator, which in this case is consoles. Consoles still have to pick between full resolution and "high" frame rate e.g. 120 FPS, even at 1080p.

I don't expect adoption numbers for 4K/8K will change until some marketing exec successfully upsells next gen consoles on "8K" capable, despite the fact it will almost certainly be upscaling. Until that point, 24 GB VRAM will primarily continue to be useful for things other than gaming.
 
I don't think 8K gaming ever will be normal. I mean look at all the years that 4K has been out and according to the Steam Hardware Survey the adoption rate is still below 2%.

It's not the cost of the 4K monitors that's holding it back. It's the cost of the GPU to run it.

Then you look at 8K and that's as many pixels as four 4K monitors to process.

Yeah, I agree.

Although there are more cards today than ever that can give a satisfactory 4K experience, especially with DLSS/FSR, the price of entry seems to be getting higher and higher. The 4070 Ti, for example, while decent for 4K gaming, loses a lot of steam versus its 1440p performance, and even the 7900 XT, which released at 900 USD, is a card I personally wouldn't use for 4K gaming.

I switched to 4K gaming back in 2014, almost a decade ago, but I still enjoy 1440p high refresh, which I feel is more balanced visually and cost-wise for most people.
 
No queuing = no added input lag.
No "added" input lag, key word here is "added", game with vsync at 60fps is typically triple buffered that means 2 frames worth of delay so roughly 30ms.

Say you cap at 60 with no vsync, still, every fame costs 16ms, so the input lag is at the very least 16ms.

Say you don't cap and the game runs at 120 average, that's 8 ms per frame at the very least now. This isn't entirely accurate because a typical rendering pipeline has more frames rendered ahead of times but you get the point (hopefully), the less time you spend on a frame the lower the input lag.

Again, if you cared about input lag, you'd go for uncapped, that's not a matter of opinion it's just numbers. Why you insist to believe that getting tearing (which by the way is less noticeable at higher framerate aka uncapped) and not getting the lowest input lag possible as well is the superior choice is beyond me, because it's objectively the worst, you get both suboptimal input lag as well as visual defects. But each to their own, for the record I never run uncapped, I do not care for input lag, it's always vsync or VRR for me.
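
To put the same rough numbers in one place (my own simplified estimate in Python; it ignores render-ahead queues and display scanout, as noted above):

Code:
def frame_time_ms(fps):
    # time needed to produce one frame, in milliseconds
    return 1000.0 / fps

# very rough lower bounds on input lag for each setup
vsync_60_triple_buffer = 2 * frame_time_ms(60)   # ~33 ms: two queued frames of delay
cap_60_no_vsync        = 1 * frame_time_ms(60)   # ~17 ms: at least one full frame time
uncapped_120_average   = 1 * frame_time_ms(120)  # ~8 ms: shorter frames, less lag

print(vsync_60_triple_buffer, cap_60_no_vsync, uncapped_120_average)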

Ignorance truly is bliss.

I mean, if you don't acknowledge facts, you can certainly pretend you're right! What a revolutionary doctrine!

Yeah, yeah, spare me the boilerplate. On to more important matters: where's the proof of something I never said?
 
@aus and ferret,

I don't buy on a whim, and I rarely buy at full price, so please don't try to put me in a box just because I didn't listen to the haters. I have said COUNTLESS times that this game is an exception, with the lifted 2-hour refund limit. I just didn't see any reason to refund it, and I still don't after well over two playthroughs now. In fact, I'm only liking it more on Survivor mode, as it ramps up the challenge with no see-through-walls "Hearing" feature and half as many available resources to scavenge. But hey, if you're only into games for the atmosphere and exploration, I can see why you'd only value it at 10 GBP. I happen to want MUCH more than that out of my games, so I get why you don't understand my rationale. :rolleyes:

Furthermore, if you guys were 65 like me, with ever-fading memory, eyesight, reflexes, and dexterity, maybe you'd get that waiting a long time for price drops makes me wonder whether I'll still be able to play with the challenges I put on myself by the time a game hits YOUR idea of an acceptable price. I am by no means a casual gamer, but I know playing on the harder modes will one day become quite the chore. I also only play select genres of games, so I don't buy nearly as many as some do, which makes shopping at or near full price once in a while very manageable for me.

It seems like you guys judge players the same way a lot of the skeptics do games, without really knowing them, and it's clear you don't get how business works. You can say all you want that boycotting or waiting for bargain-bin prices will improve game quality, but the practical man knows it's the early months of a release at or near full price that keep developers stable enough to even AFFORD to put out good games. They'd all go broke and stop making games if every player listened to your way of thinking.
 
Sure, but the number of people who can afford 8K (which is 4x 4K, not two 4K monitors, to be clear) is small, and for those who choose to go that route, I don't think they'd be budget-limited to the 4080 over the 4090. And again, this is a resolution question, not a "worth it for gaming" question.

Well, what is worth it for gaming does depend on the resolution you play at, as well as the games you play :)

But I will say this much: regardless of resolution, I wouldn't personally buy a GPU for AAA games with any less than 16 GB in this day and age. We are already seeing several games where 12 GB will be right on the border even at 1080p.
 
Yeah, and if the original Titan did have 24 GB of VRAM, it would still be borderline useless today, because VRAM is only useful if the card is powerful enough to use it.

The point of this thread is that there are obvious scenarios where cards have more VRAM than they need or can use. For anything below halo cards today, 24 GB of VRAM is pointless.

I'd take a 4080 over a 3090 any day, because while it has 8 GB less VRAM and is therefore bad according to many in this thread, it's going to deliver a much better experience.

That wasn't my point; I actually agree with you on that. It's something that belongs at the halo level at this point in time. What I'm saying is that it's not a negative for a GPU to be well supplied with memory. 16, 24... both are well supplied.
 
Well, what is worth it for gaming does depend on the resolution you play at, as well as the games you play :)

But I will say this much: regardless of resolution, I wouldn't personally buy a GPU for AAA games with any less than 16 GB in this day and age. We are already seeing several games where 12 GB will be right on the border even at 1080p.

What's crazy and a little sad at the same time is that the 4060/4060 Ti will both likely come with 8 GB of VRAM, which is just pathetic..... Even entry-level cards had 8 GB like 7 years ago, smh.

As much as I don't like the 7900 XT/7900 XTX, at least they come with adequate VRAM.
 
IMO, ideally, you'd always want the GPU to be the bottleneck in your system, as that only means lower average framerates (whether "lower" means 60 instead of 80, or 200 instead of 250), while a RAM or VRAM bottleneck means stutters or freezes during asset loading, and a CPU bottleneck usually means random stutters at random points, both of which are infinitely more annoying than a generally lower FPS.
I wish my GPU-limited scenario in Metro Exodus with my 1080 Ti let me get 60 FPS, but it drops to the low 40s, which is noticeable even with G-Sync. :(
 
I wish my GPU-limited scenario in Metro Exodus with my 1080 Ti let me get 60 FPS, but it drops to the low 40s, which is noticeable even with G-Sync. :(

40-60 FPS feels pretty terrible on most monitors; G-Sync/FreeSync isn't going to fix that.
 
I wish my GPU-limited scenario in Metro Exodus with my 1080 Ti let me get 60 FPS, but it drops to the low 40s, which is noticeable even with G-Sync. :(

Wasn't Metro Exodus performance mostly a problem with your OC?
 