
Is 24 Gb of video RAM worth it for gaming nowadays?

  • Yes: 66 votes (41.5%)
  • No: 93 votes (58.5%)
  • Total voters: 159
I mean the dude thinks the CPU is "rendering" frames and you agree with him. :roll:

Clearly I am amongst the brightest minds this forum has to offer.
So your problem is I didn't say prepared instead of rendering? That's what you got caught up on? Really now?
 
Right, right. Jensen called, he requires your expertise

Probably your first-grade teacher should be calling, because you can't read.

It clearly says the CPU is preparing frames, not that it's rendering anything ahead, you doofus.
 
So your problem is I didn't say prepared instead of rendering? That's what you got caught up on? Really now?
When he loses arguments, which happens frequently, he starts getting pedantic, something I've noticed previously too.
Wow, you're certainly trying hard to be pedantic. It's useful in an xx90-tier card, yes, because it has the power to drive the resolutions where large framebuffers are needed, as stated.

Yes... and you can do so with different density chips bud. A hypothetical card's bus would be saturated with four 1 GB chips or four 2 GB chips, the difference is cost and whether more memory would give a performance advantage, which isn't always "yes".

You could make the 3060ti for example a 16 GB card easily - just swap the 1 GB chips with 2 GB ones. It's still pointless as the GPU isn't powerful enough to push resolutions where 16 GB is needed. All doing that would achieve is driving up cost.
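If it helps, here's the back-of-the-envelope version of that (purely illustrative; it assumes 32-bit GDDR6 packages and ignores clamshell configurations):

```python
# Rough sketch of the chip-density point above (illustrative, not a spec sheet).
# Assumes each GDDR6 package sits on a 32-bit slice of the bus; clamshell modes ignored.

CHIP_INTERFACE_BITS = 32

def vram_capacity_gb(bus_width_bits, chip_density_gb):
    """Total VRAM = number of chips the bus can host * capacity per chip."""
    chips = bus_width_bits // CHIP_INTERFACE_BITS
    return chips * chip_density_gb

# A 256-bit card (3060 Ti class) hosts 8 chips:
print(vram_capacity_gb(256, 1))  # 8  -> 8 GB with 1 GB chips
print(vram_capacity_gb(256, 2))  # 16 -> 16 GB with 2 GB chips on the same bus
```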
 
So your problem is I didn't say prepared instead of rendering? That's what you got caught up on? Really now?

I mean, of course: if you mix up the most basic terms imaginable, am I supposed to give you the benefit of the doubt that you know anything at all?

He goes around every topic - being wrong in all of them - and I don't know wtf is going on. I don't believe he is real for a second; he can't be wrong on absolutely everything. He is just baiting us.
Sounds like cope to me.
 
When he loses arguments, which happens frequently, he starts getting pedantic, something I've noticed previously too.
Yeah, you should have seen the 4090 thread - he used similar tactics. Utterly ridiculous.
 
I mean, of course: if you mix up the most basic terms imaginable, am I supposed to give you the benefit of the doubt that you know anything at all?
Man who's confidently, insistently been wrong for several pages, challenged by over five different other posters who provided sources, gets upset that someone used an *interchangeable* term. :laugh:
 
I mean, of course: if you mix up the most basic terms imaginable, am I supposed to give you the benefit of the doubt that you know anything at all?
I don't know anything at all. Does Jensen? There's an option right there in his control panel that limits your framerate (reducing the GPU bottleneck) to reduce your input latency.

You confused input latency with frametimes and I'm supposed to give YOU the benefit of the doubt that you know anything at all? :D
 
He goes around every topic - being wrong in all of them - and I don't know wtf is going on. I don't believe he is real for a second; he can't be wrong on absolutely everything. He is just baiting us.
Honestly, I'm not meaning to be insulting to ya, but this is gonna sound abrasive:
Have you been outside or watched mainstream media lately? People being wrong on (seemingly) everything w/ absolutely no 'conscious trolling' is super common.

To try and 'walk back towards the topic':
More VRAM is better, IMO.
But there are caveats and seemingly pedantic technical details that factor in, like memory compression engines and bus width. (Not to mention my opinion is based on longevity of ownership, not pure performance.)
 
someone used an *interchangeable* term.
It's not interchangeable, unless you don't know what you are talking about.

Or unless you're playing Blender instead of a game.
 
It's not interchangeable, unless you don't know what you are talking about.

Or unless you're playing Blender instead of a game.
Fascinating how the rest of us managed to comprehend him though, isn't it?

Keep changing the subject once you've been proven wrong, as is your habit.

You're totally right, the CPU definitely didn't provide any frames.

[screenshot attachment]
 
Fascinating how the rest of us managed to comprehend him though, isn't it?

Keep changing the subject once you've been proven wrong, as is your habit.

[screenshot attachment]
He is now trying to go the "I was wrong but so are you" route :roll:

He still refused to accept he was wrong. Priceless
 
Have you guys considered the possibility that Vya Domus is just trolling you?
 
You confused input latency with frametimes
No I didn't, utter nonsense. When did I ever say they're the same, or whatever else you're implying? Just point to it.

Fascinating how the rest of us managed to comprehend him though isn't it.
It's not a matter of comprehending or not, the guy doesn't know which words to use, a clear sign of being illiterate on the subject. I am not surprised you agree with him.
 
Have you guys considered the possibility that Vya Domus is just trolling you?
Yes.
It's pointless; he can't differentiate between the fact that frame time and input lag are associated and the fact that input lag is also influenced by many other factors.

We're arguing logic with someone who doesn't understand definitions, concepts or systems with more than two variables.


We need to stay on topic - VRAM in gaming. Don't feed the troll.
 
Have you guys considered the possibility that Vya Domus is just trolling you?
Oh boy, I wish that was the case.
 
Okay, I'll give it one last go:

Let's say you have frames 1, 2, 3, 4, 5 and 6 rendered at 120 FPS, but your monitor is only capable of 60 Hz.
  • With uncapped frames, you'll see
    • frame 1
    • halves of frames 2/3
    • frame 4
    • halves of frames 5/6
  • With traditional Vsync, you'll see frames 1, 2, 3, 4, 5 and 6 at 60 Hz which will feel mushy and extremely input laggy.
  • With Enhanced/Fast Sync, you'll see frames 1, 3, 5, 7, etc., but the rest will be rendered, too, which generates heat and consumes power.
  • With a 60 FPS cap, you'll see frames 1, 3, 5, 7, etc. at the exact time they need to be on screen, and the rest won't be rendered.
Are we clear now?
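And if it's still not clear, here's that 120 FPS vs 60 Hz example as a quick sketch (a toy model only; it ignores render-time jitter and where exactly the tear line lands, and just counts how many frames finish inside each refresh window):

```python
# Toy timeline: frames finishing at 120 FPS vs. capped at 60 FPS, on a 60 Hz display.
# Only shows how many frames complete inside each refresh window; numbers illustrative.

REFRESH_MS = 1000 / 60  # one 60 Hz refresh interval

def frames_finished_in_refresh(frame_ms, refresh_index, total_frames=12):
    """1-based frame numbers whose completion time falls inside the given refresh window."""
    start = refresh_index * REFRESH_MS
    end = start + REFRESH_MS
    return [n + 1 for n in range(total_frames) if start <= n * frame_ms < end]

for r in range(3):
    print(f"refresh {r}:",
          "uncapped (120 fps) ->", frames_finished_in_refresh(1000 / 120, r),
          "| capped (60 fps) ->", frames_finished_in_refresh(1000 / 60, r))
# Uncapped, two frames land in every refresh window (hence the 'halves' above);
# capped at 60, exactly one frame lands per window.
```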
 
You confused input latency with frametimes and I'm supposed to give YOU the benefit of the doubt that you know anything at all? :D
No I didn't, utter nonsense. When did I ever say they're the same, or whatever else you're implying? Just point to it.


It's not a matter of comprehending or not, the guy doesn't know which words to use, a clear sign of being illiterate on the subject. I am not surprised you agree with him.
Whoops, obviously you should have deleted your old posts before taking this stance bud. Running out of ground to stand on, assuming you ever had any to begin with.
That's categorically not true: the higher the framerate, the lower the average latency, so by capping your framerate you are most certainly not lowering your latency; that makes no sense.


And you are also increasing the average frametimes as a result; I don't know what use this could have to anyone. Also, I don't know where you guys got this idea that maxed-out CPU/GPU = more even frametimes or less latency; none of that is true.



If you can't be bothered to actually explain yourself and you just post a link, I am going to assume that, as in many other things, you didn't have anything intelligent to say.

100 frames in a second = an average frametime of 10ms

120 frames in a second = an average frametime of ~8.3ms

By capping the framerate there is no guarantee that you'll get more even frametimes but you will certainly get longer frametimes. Also Enhanced Sync and Fast Sync do not cap the framerate, the game still runs uncapped internally, you just don't see the tearing. And VRR obviously needs a capped framerate for it to work, nothing interesting there.
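For anyone following along, the arithmetic behind those two lines is just the reciprocal of the framerate, which is also why a 60 FPS cap means longer average frametimes:

```python
# Average frametime is just the reciprocal of the framerate.
def avg_frametime_ms(fps):
    return 1000.0 / fps

print(avg_frametime_ms(100))  # 10.0 ms
print(avg_frametime_ms(120))  # ~8.3 ms
print(avg_frametime_ms(60))   # ~16.7 ms - capping to 60 means longer average frametimes
```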
 
He is now trying to go the "I was wrong but so are you" route :roll:

He still refused to accept he was wrong. Priceless
Considering how guilty I've been of precisely that (without trolling, and with less persistence) in years past (and occasionally today, realized only in retrospect)...
There's a chance for them to learn and grow :)
Until then, (y)our efforts will (largely) be wasted.

Ignore them; and if you're conscientious, just passively try to allude to their mistakes in constructive conversation.
Directly trying to provide evidence that they're wrong doesn't help, and is just more opportunity for you to get torn apart over the smallest error(s) and mis-communications/perceptions.
 
Well, 3/5ths of the vote says 24 GB VRAM isn't worth it today for gaming and 2/5ths of the votes say it is worth it.

Of course it will be worth it at some point in the future but I just think it's overkill today.
 
  • With a 60 FPS cap, you'll see frames 1, 3, 5, 7, etc. at the exact time they need to be on screen.
You don't see them at the exact time they need to be on screen, whatever that even means; you see them at the time they are received by the monitor, like in the uncapped case, and you also get tearing like in the uncapped case. Unless you have a form of synchronization on, you always get this behavior no matter how many frames per second the game runs at, because the synchronization rate does not match the cadence of frames being delivered. Otherwise there would be no point in vsync even existing.

In this situation not only do you get input lag comparable with the vsync case, but you also get tearing, so you stand to gain nothing. The only reason you might want to do this is if the game you're playing can't quite hit 60 and you don't want stutter. As a matter of fact some games, especially on consoles, are doing this: whenever the framerate drops below 60 they switch vsync off. That's why I said this hardly achieves anything in my initial comment; hopefully now you get it.

Whoops, obviously you should have deleted your old posts before taking this stance bud. Running out of ground to stand on, assuming you ever had any to begin with.

At first I thought you just refused to read, but now I know you probably have some literacy problem as well; you are supposed to be a proofreader, imagine that. There isn't a single place in those comments where I said that input lag and frame times are the same thing, just that lower frametimes naturally lead to lower input lag, which is 100% correct.
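To spell that out with a toy model (every number below is invented; only the structure matters): render time is one term in the end-to-end chain, so lowering it lowers the total, but the other terms don't go away.

```python
# Toy end-to-end input-lag budget (every number is made up; only the structure matters).
# Frametime is one term in the chain, not the whole chain.

def end_to_end_latency_ms(frametime_ms,
                          input_sampling_ms=4.0,  # polling / OS input delivery
                          sim_ms=3.0,             # game simulation before the frame is submitted
                          queued_frames=1,        # frames buffered ahead of the GPU
                          display_ms=8.0):        # scanout + panel response
    # Each frame queued ahead of the GPU adds roughly one extra frametime of delay.
    return input_sampling_ms + sim_ms + (1 + queued_frames) * frametime_ms + display_ms

print(end_to_end_latency_ms(frametime_ms=10.0))  # ~100 FPS render
print(end_to_end_latency_ms(frametime_ms=8.0))   # ~120 FPS render: lower total, but not by much
```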
 
Well, 3/5ths of the vote says 24 GB VRAM isn't worth it today for gaming and 2/5ths of the votes say it is worth it.

Of course it will be worth it at some point in the future but I just think it's overkill today.
To be fair: I voted no, because 24 Gigabits of RAM is way too little today. :p

The entire thread lies on a shaky foundation, IMO.
A 1080p gamer has little immediate need for 24GB of VRAM, and the 4K and 8K crowd have performance limits beyond just the VRAM pool.
 
Well, 3/5ths of the vote says 24 GB VRAM isn't worth it today for gaming and 2/5ths of the votes say it is worth it.

Of course it will be worth it at some point in the future but I just think it's overkill today.
I think that when Plague Tale at 4K ultra uses around 4.5 to 5.5 GB of VRAM, the whole VRAM discussion is pointless. For comparison, TLOU requires 9.5 GB at 720p, and it looks much worse than Plague Tale. So the only thing having more VRAM will achieve is games hogging more and more because devs don't optimize crap. You won't get better visuals, you'll just pay more for a card with more VRAM that also uses more power. Great, isn't it?
 
Well, 3/5ths of the vote says 24 GB VRAM isn't worth it today for gaming and 2/5ths of the votes say it is worth it.

Of course it will be worth it at some point in the future but I just think it's overkill today.

It will obviously depend on your use case.

If all you do is play esports games, then no, it ain't worth it.

But if you are playing AAA games, then you are gonna be glad you have 24 GB of VRAM instead of 12 GB in the coming years - we already see it being the case in several titles.
 
You don't seem them at the exact time they need to be on screen, whatever than even means, you see them at time they are received by the monitor like in the uncapped case and you also get tearing like in the uncapped case. Unless you have a form synchronization on you always get this behavior no matter how many frames per second the game runs at because the synchronization rate does not match the cadence of frames being delivered.

In this situation not only you get comparable input lag with the vsync case but you also get tearing, so you stand to gain nothing. The only reason you might want to do this is if the game you're playing can't quite hit 60 and you don't want stutter. As a matter of fact some games, especially on consoles are doing this, whenever the framerate drops below 60 they switch vysinc off. hat's why I said this hardly achieves anything in my initial comment, hopefully now you get it.



At first I thought you just refused to read but know I now you probably have some literacy problem as well, you are supposed to be proofreader as well, imagine that. There isn't a single place in those comments where I said that input lag and frame times are the same thing, just that lower frametimes naturally lead to lower input lag which is 100% correct.
See,
that,
form of,
on,
case,
that's
frame times

Hopefully that clears up my proofreading skills, since you seem to be calling them into question :).

First skim read is free, I'll bill you next time.

It will obviously depend on your use case.

If all you do is play esports games, then no, it ain't worth it.

But if you are playing AAA games, then you are gonna be glad you have 24 GB of VRAM instead of 12 GB in the coming years - we already see it being the case in several titles.
Maybe, but it's not the reason you'd buy a 4090/XTX; you do that for the power of those cards, not the size of their frame buffers. There are several exceptions to this, such as if your work involves 3D editing or video production.

4K textures will remain at similar sizes, and unless future games start offering higher resolution textures for 4-8K displays, I don't really see this changing. What could change is developers getting lazier and releasing more unoptimized games.
 