
Are game requirements and VRAM usage a joke today?

From AMD's perspective, it's probably "why not?" given how cheap VRAM is wholesale right now. Given how hungry modern games are for VRAM, I would be doing the same. The nice thing about high texture resolution is that it's practically free on performance: it needs VRAM, but it doesn't really need rendering performance.

The most popular games are online competitive games where extra VRAM is useless. I play PUBG and it uses about 5 GB at 4K on competitive settings.
Using a silicon cost calculator, for an extra $60 AMD could make a much bigger GPU that could be ~30% faster than the current RX 7600, but the tape-out cost for a new chip is huge, so...
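The back-of-envelope math behind such a cost calculator can be sketched in a few lines of Python, using the classic dies-per-wafer approximation. All numbers below are illustrative assumptions (die areas, wafer price), and yield and scribe lines are ignored:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic dies-per-wafer approximation (ignores scribe lines and yield)."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def die_cost(die_area_mm2: float, wafer_cost_usd: float) -> float:
    """Cost per die at perfect yield for a given wafer price."""
    return wafer_cost_usd / dies_per_wafer(die_area_mm2)

# Hypothetical inputs: Navi 33 is roughly ~204 mm^2; assume a $10,000 wafer.
small = die_cost(204, 10_000)
big = die_cost(300, 10_000)   # a hypothetical ~50% larger die
print(f"~${small:.0f} vs ~${big:.0f} per die, delta ~${big - small:.0f}")
```

Even doubling these assumed wafer prices, the per-die delta stays far below the retail price gap between GPU tiers, which is the poster's point: the one-off tape-out cost, not silicon, is the barrier.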
 

Yeah, those games are designed to be played at crazy high frame rates with less focus on visuals, but some of us play RPGs that are made to be pretty at low frame rates. AMD is catering to guys like me; it's just the lack of SGSSAA keeping me away.
 
I can only laugh when I see people cherry-picking numbers to force cards like the 3070 into having VRAM issues, by running the most demanding games today at 4K/UHD, sometimes with RT on top, just to run the VRAM dry. Not a real-world scenario at all. The GPU itself would not even be able to run these settings, even if it had 16GB of VRAM. GPU power is the problem, not VRAM.

In most cases, the GPU is the limiting factor, not the VRAM. Besides, 95% of PC gamers are using 1440p or less (Steam HW Survey). Native 4K/UHD fps numbers mean little for the majority of PC gamers, and 96% of people on the Steam HW Survey have 12GB of VRAM or less. Developers make games to earn money, and you don't sell many games if you make them for a 4-5% market share.

Besides, most games today look almost identical on the high and ultra presets, while high uses a lot less VRAM. Motion blur and DOF are often pushed on ultra presets and use more memory while looking worse to the end user. Even the medium preset looks great in most new games, and is sometimes even "cleaner" than high and especially ultra, which is filled with blur, DOF and all kinds of features that tank performance but don't necessarily make the visuals better.

This is something I can agree with, because it's also what I'm personally experiencing with a 3060 Ti, aka a lowly, shitty 8 GB card according to some people around here (and yes, I did pick this card over a 6700 XT).
People keep bringing up the 4K argument even though it's still not a common resolution to game at. Maybe on forums/tech sites like TPU it is, but in general, nope, and it's also a moot point for me playing on a 2560x1080 monitor, which has fewer pixels than a 1440p monitor.

In newer games I run out of GPU power way before running out of VRAM, especially with Unreal Engine 5 games, and those are going to be more and more common, just like the UE 3-4 games before them. (Sure, the devs could use very high-res textures in the future, but the engine itself is very GPU-heavy if Nanite and Lumen are used, and that's going to be the limiting factor first.)
Immortals of Aveum was seriously choking my GPU on high settings even with DLSS, but it had no VRAM-related issues. Same with A Plague Tale: Requiem (in-house engine, but still), which is an amazing-looking game with great textures, and it had zero VRAM issues with RT off (it didn't even have RT when I was playing it at release), but the game was heavily limited by my GPU.

Older or lighter games are also a moot point, because those are easy to run anyway, even maxed out.

Sure, there's always that edge case with badly optimized games at launch, but even then it's not a big deal to turn settings down a notch or wait for a fix, like with The Last of Us, which is perfectly fine by now even on 8 GB cards at 1440p ~high settings, so I'm all good whenever I decide to play/buy it.

While I do agree this VRAM issue is a real thing, in my opinion it's way overblown, with people always finding a reason for it or pushing their cards where they shouldn't even be performing in the first place.

Personally, I would have no problem buying a 12 GB card like a 4070/Super if I could afford it, because I'm sure it would easily last me my usual ~3 years, if not more (this console generation at least, plus I'm not planning on upgrading my resolution/monitor either).
 
For reference, FF15 is pushing my VRAM to 9.5 GB utilisation (out of 10 GB). I am playing at 4K rendering, and the GPU itself is barely at 30% utilisation with reflections set to high (that's a GPU killer in FF15). The 4K texture pack is, of course, pushing the VRAM usage up.

We have to bear in mind that the comments claiming rendering resources are exhausted first depend on the games you play and the settings used in those games. I cannot remember when I last played a game where my GPU was sustained over 50% utilisation. FF7 Remake had moments: if I customised the engine settings to add particles etc., then it would occasionally, on things like limit break activations, briefly hit the 90s.

The reason it's blown up a lot is that VRAM is not easy to upgrade; the whole GPU has to be replaced. If it were a modular system, where memory could be expanded the same way as DRAM, then it wouldn't be such a big deal. Some of us are sensitive to substandard textures, pop-in and so forth.

Another ironic thing is that Nvidia is pushing RT, which is itself a VRAM guzzler.

For me, the bottlenecks in anything recent I've played (last 3 years) are probably in this order:

CPU -> VRAM -> GPU rasterization. Which is funny, as the dominant opinion puts the GPU first. However, I don't play at high frame rates, so my GPU is routinely at low to medium utilisation. My CPU platform upgrade easily feels more impactful on my games than my last GPU upgrade.

For me it's GPU > CPU > VRAM. I don't play competitive games or simulators/heavy strategy games, but pretty much everything else goes (currently playing Far Cry 6 maxed out, and it does max out my GPU during heavy fights/scenes).
At my resolution and 75 Hz refresh rate, I find myself running out of GPU power first in more demanding or newer games. I guess that's normal with weaker cards like mine (the 3060 Ti is still stronger than what most people have outside of forums/sites like this, supposedly).

Also, not every game is sensitive to maxed-out VRAM usage; some games won't even stutter if you go over the limit. It really depends on the game, and some games will stutter no matter what, because they're just crap like that (say hi to the infamous UE4 stutters...).
 
If the game ever goes on sale I will take a peek myself.
It's on a 25% sale on Epic right now.

I also took a look with a 1660 Ti because my 2070 seems to be half dead at this point. Besides the loss of average performance, the game stutters quite a bit more, and the LOD quality drop is way more noticeable. As a conclusion, I'd say that despite the game being able to allocate 11-13 GB VRAM, 8 GB is what it really needs. 6 GB gives you stutters and object drop-ins that you don't want.
 
I'd say that despite the game being able to allocate 11-13 GB VRAM, 8 GB is what it really needs. 6 GB gives you stutters and object drop-ins that you don't want.
Yeah, the GPU has to resort to system RAM (much slower) when it runs out of available VRAM, resulting in increased latency, stuttering and FPS drops.
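A bit of back-of-envelope arithmetic shows why that spill hurts so much. The bandwidth figures below are illustrative assumptions (an 8 GB card in the RTX 3070's class, and PCIe 4.0 x16 theoretical one-way throughput), not measurements:

```python
# Rough, assumed figures: actual numbers vary by card and platform.
vram_bw_gbps = 448.0   # on-board GDDR6 bandwidth, RTX 3070-class card
pcie_bw_gbps = 31.5    # PCIe 4.0 x16 theoretical one-way throughput

slowdown = vram_bw_gbps / pcie_bw_gbps
print(f"Assets spilled to system RAM stream roughly {slowdown:.0f}x slower over PCIe")
```

Real-world spill behaviour also depends on what spills (rarely-touched assets may cost almost nothing), which is why 8 GB cards often survive mild overcommitment while 6 GB cards visibly stutter.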
 
Yes, we play different games so different observations.
 
And I am a thorough nutter, so I elaborately and extensively tweak (and freak) Cyberpunk 2077 over and over and over again. At 4K + FSR Performance, playability with RT Reflections enabled is a given if your GPU is something like an RTX 4070 Super.

But I've got another dozen-gigabyter. The RX 6700 XT wasn't made for RT whatsoever, let alone 4K + RT. But this card + tweaked configuration files + modded UHD textures = I finally made use of 11.5 of the 12 GB I had onboard. The game itself is borderline, with FPS hovering around 43, give or take 5, with episodic leans towards either 30 or 60 FPS. But with double the scaling, RT Reflections are great, even despite me using 1080p internal rendering.

Anyway, realistically, an average Joe runs out of VRAM... way after he hits a stutterfest due to the CPU/GPU/RAM being too sluggish, where no additional VRAM can help the matter. Of course, some games are much heavier on VRAM than on computing power, but those are outliers anyway.
 
Yeah, the GPU has to resort to system RAM (much slower) when it runs out of available VRAM, resulting in increased latency, stuttering, FPS drops.
Absolutely. The big question is: do you notice when the game touches system RAM to make up for the lack of VRAM? Based on what I saw, I'd say that with 8 GB of VRAM you don't, but with 6 you do.
 
Everything can be tested very easily: is your VRAM enough or not?
Start Superposition in this mode, choose how much VRAM it should allocate (as you can see, it can be quite a lot), minimize it, start your game/benchmark/whatever, and test.

I tested it with Shadow of the Tomb Raider, but because I don't have a swap file and everything sits in RAM/cache, the frame drop was really small and there was no stuttering; it will be interesting to see on your systems :D
When I start TR, my allocated RAM is 32GB; at the end of the benchmark it goes to 38.5GB, so part of this 6.5GB has been transferred from VRAM to RAM.

If you're looking into whether the game really needs all this VRAM: as we know, everything comes down to game optimization, which depends on the studio that developed the game. How much AMD/Nvidia really paid for these "optimizations", we can only speculate.
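If you'd rather script the same check, a minimal sketch in Python can parse VRAM readings. On Nvidia cards, `nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits` prints one `used, total` line per GPU; the sample values below are made up for illustration:

```python
def parse_vram_csv(line: str) -> tuple[int, int]:
    """Parse one line of `nvidia-smi --query-gpu=memory.used,memory.total
    --format=csv,noheader,nounits` into (used_MiB, total_MiB)."""
    used, total = (int(tok.strip()) for tok in line.split(","))
    return used, total

# Hypothetical captured sample line; in a live check you would feed in
# subprocess.check_output([...]) from the actual nvidia-smi call instead.
used, total = parse_vram_csv("9832, 10240")
print(f"{used} / {total} MiB ({used / total:.0%} of VRAM in use)")
```

Polling this in a loop while the Superposition allocator holds memory gives a rough picture of headroom, though (as discussed below) it still only shows allocation, not true need.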




 
I'm playing Far Cry 6 these days.
My monitor is 3440x1440 (4.95 MP), but I've set the rendering scale to 1.5x.
That is 5160x2160 (11.15 MP). Everything else is maxed out, with HD textures on.
For comparison, 4K is 8.3 MP.
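The pixel counts above are easy to sanity-check; a couple of lines of Python reproduce them:

```python
def megapixels(width: int, height: int) -> float:
    """Total pixel count of a resolution, in megapixels."""
    return width * height / 1e6

resolutions = {
    "3440x1440 (native)": (3440, 1440),
    "5160x2160 (1.5x render scale)": (5160, 2160),
    "3840x2160 (4K UHD)": (3840, 2160),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {megapixels(w, h):.2f} MP")
```

So the 1.5x render scale more than doubles the pixel load versus native, and pushes ~34% past even 4K, which is why the VRAM figures below are so high.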

The average VRAM usage is between 14-15GB, and it peaks at 16+.
System RAM is loaded with no more than 8-9GB by the game (after 2 hours).
Average FPS is around 95, with lows around 75.
Gameplay is very smooth.

I should note here that the monitor works great with VRR enabled from Adrenalin, even though it's a G-Sync certified one. That contributes to the smoothness.

I'm very curious about this though:
Untitled_59.png


Does anyone know what the "GPU Memory Usage" stands for?
 
aces_2024_08_16_21_51_13_085.jpg

I'm just using the default "movie" graphics preset with DLSS Balanced only, but VRAM usage is 9.6GB at 3440x1440, with graphics quality just like that... ??

aces_2024_08_17_10_18_42_542 a1.jpg

Sometimes, after playing for 1-2 hours, VRAM usage is 10GB?
Even after coming back to my hangar?
What the hell...
 
It might not be the game's fault. After reading about this in another thread, and doing some investigation of my own, I've started to see Windows using more VRAM after closing a game than before. On a fresh boot, VRAM usage sits around 1.5 GB on my 6750 XT, but after closing a resource-heavy game, it can stay up at 2.5-3 GB for some reason.
 
You can do that with a mainstream platform, too. With B650, you get 28 PCI-e lanes coming from the CPU (1x 16 for GPU, 2x 4 for NVME and x4 for the chipset), and several others from the chipset. My basic m-ATX MSi Pro board has an x4 slot that can easily fit an m.2 adapter, plus an x1 slot. It also comes with integrated wireless. You get even more with X670. If you need more than 16 CPU cores, though, fair enough.
Sorry this is so old, but you are absolutely right. You can hook up a lot of stuff to a consumer computer, at least to what Flyordie has. I have 2 SATA SSDs connected, a DVD drive and 4 NVMe drives: 3 connected at PCIe 4.0 x4 and one at 3.0 x1. Then the GPU, of course, at 4.0 x16. And I haven't even used up all my lanes. I had one more NVMe hooked up at one point, another single-laner, so I actually still have 3 lanes left (though they are only the PCIe 3.0 ones). And I still have 1 SATA port left too, one that doesn't interfere with any of the PCIe connections (I know that's not necessarily the case on all boards, unfortunately).

Anyway, I just wanted to say you were right: you don't need a Threadripper to connect a lot of stuff. I mean, you don't even need the best chipset (B760 here). Not that it wouldn't be interesting to play around with a Threadripper; I just don't think most people need it.

It might not be the game's fault. After reading about this in another thread, and doing some investigation of my own, I've started to see Windows using more VRAM after closing a game than before. On a fresh boot, VRAM usage sits around 1.5 GB on my 6750 XT, but after closing a resource-heavy game, it can stay up at 2.5-3 GB for some reason.
Interesting. I always monitor my VRAM per process, so this must have been a blind spot for me.

Anyway, it's almost like games are getting better at controlling their VRAM use lately. I guess developers/porters are learning to move stuff that doesn't need to be in VRAM into system RAM. I'm playing Sand Land right now at 4K max settings, and it's not even passing 6GB. I guess that makes sense, since it says it runs on a 1060, but I usually don't take that to mean with maxed-out settings. That, or... the settings don't really do much, hehe... guess that's probably it.

EDIT: And I totally just remembered the 1060 had a 3GB version... that makes more sense now. If 4K high settings use 5.2GB of VRAM, then perhaps 1080p low settings use less than 3.
 
vram usage is 10gb
It doesn't necessarily mean the game consumes that much, even if it has that much allocated. You should monitor your frametime graph: if there's not a single noticeable spike, it's butter smooth, and you can't see any textures becoming distorted or missing, then it's just allocation, and real use is definitely below 10 GB (assuming you're on an RTX 3080 10 GB).

So, yeah, if you're short on VRAM you will notice:
• Ridiculous lag spikes;
• Distorted or missing textures.
These symptoms don't always come together, but sometimes they do. And even if you have both, it doesn't mean you're definitely short on VRAM; sometimes it happens because of game/OS/driver bugs, not hardware limitations.
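The "lag spike" symptom is easy to quantify from a frametime log (e.g. a CapFrameX or PresentMon export). Here's a minimal sketch; the trace values are made up, and the 2x-median threshold is just one reasonable choice, not a standard:

```python
from statistics import median

def spike_frames(frametimes_ms: list[float], factor: float = 2.0) -> list[int]:
    """Indices of frames whose frametime exceeds `factor` x the median frametime."""
    baseline = median(frametimes_ms)
    return [i for i, ft in enumerate(frametimes_ms) if ft > factor * baseline]

# Hypothetical trace: a steady ~60 fps run with two severe hitches.
trace = [16.7, 16.9, 16.5, 55.0, 16.8, 16.6, 60.2, 16.7]
print(spike_frames(trace))  # -> [3, 6]
```

If a run over the alleged VRAM limit shows no more spikes than a run well under it, the extra gigabytes were almost certainly just allocation.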
 

Well, usually, if War Thunder exhausts my VRAM, the game automatically changes the high texture setting to the medium texture setting...

Last year, I complained about this heavy VRAM usage bug to Gaijin customer service, but they dismissed my complaint; they say it's normal and there is no bug.

I also complained to a moderator on the War Thunder Steam forum, but again, they dismissed my complaint.

Yes, that's the reality in War Thunder.
 
It might not be the game's fault. After reading about this in another thread, and doing some investigation of my own, I've started to see Windows using more VRAM after closing a game than before. On a fresh boot, VRAM usage sits around 1.5 GB on my 6750 XT, but after closing a resource-heavy game, it can stay up at 2.5-3 GB for some reason.
Or it could be the driver's fault, or it could be intentional and not a fault to begin with.

Without actually knowing how the driver manages device memory, especially its garbage collection policies, we can't really conclude that this is an issue. For all we know, it could be some caching/optimisation voodoo, or the driver simply not bothering to waste time on unnecessary cleanup.
It would only be a problem if the memory isn't freed when another process needs it.
 
How much VRAM would be enough for 1440p in 2025, in your opinion?
 
How much VRAM would be enough for 1440p in 2025, in your opinion?
12GB, but I can imagine instances where using DLAA + FG simultaneously could result in a spill. Frame Generation uses about 0.5GB with DLSS, but the higher the internal resolution, the more it goes towards ~1GB.
10GB will often cause problems; we've already seen that in 2024 games at max settings.
 
How much VRAM would be enough for 1440p in 2025, in your opinion?
It depends.

Enough for now? Enough for now plus 1 year? 2? 3, 4, 5, 6, 7+?

You could get a lot of different answers here. Some might say what's perfectly enough today, some might say an amount that has you covered for a decade because that's what they think a card should last for.

Me? Buying a new card today? For 1440p you need 12GB, preferably 16GB, and that'll buy you 3+ years. Get more VRAM if you desire it.
 

I guess it would have to last at least 3 years, ideally. But there are too many variables to predict the future; I was just asking about the immediate future. We can always dial back the settings later.
 
Having enough VRAM doesn't mean the GPU will be strong enough to play at those settings, so once you turn a few settings down, both VRAM and GPU should be OK.
 