
Why Are Modern PC Games Using So Much VRAM?

I think hostility is a bit much in most cases and much of it is aimed at Publishers pushing games out before they are ready and in some cases broken. I think "fed up" would be a more appropriate term. Have a look at this:
I don't believe the hostility (personally) is towards those people who create games, I still believe there is a strong level of gratitude towards them. I believe the hostility is towards the publishers and GPU tech companies that increasingly treat the consumer as a product that they deliver to their shareholders (their one true consumer).
I'm seeing the shift in mentality indeed but for the longest time (and today still depending on who or where you ask) the nuance was completely lost to everyone.

The response from these companies is just "buy more, just pay more", but consumers have the right to ask what we are getting for our money and to say that we want better.
Yes, but we don't have to listen, do we? We can lower the settings to high instead of ultra and claw back several GB of VRAM and tens of % of FPS for a marginal difference in quality and no difference in enjoyment. Not talking about completely broken releases that run terribly on any hardware, of course. This is actually one of the points of the article that is imo much more important than our perceived competence of game devs :)
 
Because optimizing games for PC is way down the list of priorities + pubs cut costs on QA. So first devs don't get enough time to fix even the basics (games launch with game-breaking bugs so what smoothness do you expect), then the broken product they're about to ship is barely tested, finally the end result is obvious for everyone to see. As I posted before in another thread: these games absolutely don't look better or in any way ground-breaking to justify the load on merit, it's just that not enough time and resources were allocated to ship them in a respectable state.
Why publishers mismanage their teams and products (games) to this degree is a topic for another thread altogether.
 
An excellent read and explanation, even if it is on a different website.


Some interesting points

But where PCs keep the meshes, materials, and buffers in VRAM, and the game engine (along with a copy of all the assets being used) in the system memory, consoles have to put everything into the same body of RAM. That means textures, intermediate buffers, render targets, and executing code all take their share of that limited space. In the last decade, when there was just 5 GB or so to play with, developers had to be extremely careful at optimizing data usage to get the best out of those machines.

This is also why it seems like games have just suddenly leaped forward in memory loads. For years, they were created with very tight RAM restrictions but as publishers have started to move away from releasing titles for the older machines, that limit is now two to three times higher....

Thus, if a next-gen console game is using, say, 10 GB of unified RAM for meshes, materials, buffers, render targets, and so on, we can expect the PC version to be at least the same. And once you add in the expected higher detail settings for such ports, then that amount is going to be much larger....

There are other, somewhat less nefarious, factors that will also play a part – development teams inexperienced in doing PC ports or using Unreal Engine, for example, and overly tight deadlines set by publishers desperate to hit sales targets before a financial quarter all result in big budget games having a multitude of issues, with VRAM usage being the least of them.

We've seen that GPU memory requirements are indeed on the rise, and it's only going to continue, but we've also seen that it's possible to make current games work fine with whatever amount one has, just by spending time in the settings menu. For now, of course.
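A side note from me rather than the article: on the PC side, that split is explicit in the API. A D3D12 title typically creates its mesh and texture pools in GPU-local DEFAULT heaps, which means VRAM on a discrete card, and stages data through CPU-visible UPLOAD heaps that live in system RAM. A minimal sketch, with made-up sizes and no real error handling:

```cpp
// Sketch only: a 64 MB buffer placed in VRAM (DEFAULT heap) plus a staging
// buffer in system RAM (UPLOAD heap), the way a PC engine keeps meshes and
// buffers on the GPU while copying data up from main memory.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // Describe a plain 64 MB buffer (e.g. a vertex/mesh pool).
    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width            = 64ull << 20;
    desc.Height           = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels        = 1;
    desc.Format           = DXGI_FORMAT_UNKNOWN;
    desc.SampleDesc.Count = 1;
    desc.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;

    D3D12_HEAP_PROPERTIES vramHeap = {};
    vramHeap.Type = D3D12_HEAP_TYPE_DEFAULT;   // GPU-local memory (VRAM)

    D3D12_HEAP_PROPERTIES uploadHeap = {};
    uploadHeap.Type = D3D12_HEAP_TYPE_UPLOAD;  // CPU-visible system RAM

    ComPtr<ID3D12Resource> meshPool, staging;
    device->CreateCommittedResource(&vramHeap, D3D12_HEAP_FLAG_NONE, &desc,
                                    D3D12_RESOURCE_STATE_COMMON, nullptr,
                                    IID_PPV_ARGS(&meshPool));
    device->CreateCommittedResource(&uploadHeap, D3D12_HEAP_FLAG_NONE, &desc,
                                    D3D12_RESOURCE_STATE_GENERIC_READ, nullptr,
                                    IID_PPV_ARGS(&staging));
    return 0;
}
```

On a console, by contrast, both of those allocations would come out of the same unified pool, which is exactly why the article's 10 GB example translates directly into PC VRAM pressure.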

This article seems to justify it by saying these new games use high VRAM on PCs because they are ported from console, and that's how the new consoles work. However, what about titles like Jedi Survivor, which just released simultaneously on console and PC. This game is said to have the same problem, yet how could it be ported from console if it released simultaneously on PC?
 
This article seems to justify it by saying these new games use high VRAM on PCs because they are ported from console, and that's how the new consoles work. However, what about titles like Jedi Survivor, which just released simultaneously on console and PC. This game is said to have the same problem, yet how could it be ported from console if it released simultaneously on PC?
The article itself is wrong, as the driver doesn't do much for DX12 nowadays. Where they're claiming software is tracking VRAM via the drivers, software can't do that on DX12. It's the OS and the game talking to each other directly. Windows keeps track of the VRAM, extends into system RAM when usage goes over that amount, and tells the game what it has.
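For anyone wondering what "the OS and the game talking to each other directly" looks like in practice, here's a minimal sketch (my own illustration, not something from the article) of the DXGI budget query that D3D12 titles are expected to poll. Windows reports a Budget for the GPU-local segment (dedicated VRAM) and a separate one for the non-local segment (shared system RAM it can spill into); assuming a single GPU on adapter 0:

```cpp
// Sketch only: ask Windows what memory budget this app currently has.
// Budget = what the OS says the app may use right now;
// CurrentUsage = what the app has actually committed against it.
#include <windows.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;   // first GPU

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO local = {};     // dedicated VRAM segment
    DXGI_QUERY_VIDEO_MEMORY_INFO nonLocal = {};  // shared system-RAM segment
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &local);
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL, &nonLocal);

    printf("VRAM   budget: %llu MB, usage: %llu MB\n",
           local.Budget >> 20, local.CurrentUsage >> 20);
    printf("Shared budget: %llu MB, usage: %llu MB\n",
           nonLocal.Budget >> 20, nonLocal.CurrentUsage >> 20);
    return 0;
}
```

When an app's usage creeps past the local budget, the OS starts demoting resources into that shared system-RAM segment, which is where the stutter and frame-time spikes tend to come from.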
 
Well, considering Sony used Iron Galaxy instead of their own porting studio Nixxes, we can assume it was at least a scheduling issue of Nixxes not being able to hit a preferred launch window, or Iron Galaxy offering a lower bid, or both. In any scenario, the reason is money (shocker). Iron Galaxy has a history of port issues, like Batman Arkham and, I believe, Uncharted as well.

My guess is Nixxes is working on porting something else, and considering how close they are to Guerrilla Games logistically, I wouldn't be surprised if it's Horizon Forbidden West.
 
The article itself is wrong, as the driver doesn't do much for DX12 nowadays. Where they're claiming software is tracking VRAM via the drivers, software can't do that on DX12. It's the OS and the game talking to each other directly. Windows keeps track of the VRAM, extends into system RAM when usage goes over that amount, and tells the game what it has.

I play The Last of Us Part I on PC at 1080p using all High Textures (including Characters, Environments, Dynamic Objects, and Effects). I know it's not the same as VRAM, but as far as RAM usage goes, the allocated amount is always WAY higher than the amount actually used. Is W10 misreading how much it needs then? And if so, why can't the game engine itself calculate the allocation?
 
I play The Last of Us Part I on PC at 1080p using all High Textures (including Characters, Environments, Dynamic Objects, and Effects). I know it's not the same as VRAM, but as far as RAM usage goes, the allocated amount is always WAY higher than the amount actually used. Is W10 misreading how much it needs then? And if so, why can't the game engine itself calculate the allocation?
It's not misreading anything. It just sets aside a pool of system RAM that nothing else is allowed to pop into.
When you look at the DxDiag dialog box, it will show the VRAM plus the shared virtual memory amount for overflow from the GPU.
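On the allocated-vs-used question, here's a rough sketch (my own, just to illustrate the distinction, not how the game or any monitoring tool measures it): Windows tracks both the memory a process has committed (set aside for it) and the working set it is actually touching, and different tools report different counters, which is why the numbers rarely match:

```cpp
// Sketch only: committed (allocated) vs. working set (actually resident) memory
// for the current process. Committed memory counts against the system commit
// limit even if the process has never touched it yet.
#include <windows.h>
#include <psapi.h>
#include <cstdio>

#pragma comment(lib, "psapi.lib")

int main()
{
    PROCESS_MEMORY_COUNTERS_EX pmc = {};
    pmc.cb = sizeof(pmc);

    if (GetProcessMemoryInfo(GetCurrentProcess(),
                             reinterpret_cast<PROCESS_MEMORY_COUNTERS*>(&pmc),
                             sizeof(pmc)))
    {
        printf("Committed (allocated): %zu MB\n", pmc.PrivateUsage   >> 20);
        printf("Working set (in use) : %zu MB\n", pmc.WorkingSetSize >> 20);
    }
    return 0;
}
```

So when a tool reports the game having "allocated" far more RAM than it is "using", that's usually commit charge versus working set, not Windows misreading anything.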
 
Yes, but we don't have to listen, do we? We can lower the settings to high instead of ultra and claw back several GB of VRAM and tens of % of FPS for a marginal difference in quality and no difference in enjoyment. Not talking about completely broken releases that run terribly on any hardware, of course. This is actually one of the points of the article that is imo much more important than our perceived competence of game devs
I agree, in fact I believe most PC gamers just leave the graphics on the default settings. This forum (and others like it) represents the vanguard of modern PC hardware tech, where gaming is considered at very high to ultra settings. Like the article stated, and as the last line I quoted in my initial post says, you can get games to play by just adjusting settings, and 8GB cards are still viable for playing PC games... for a time.
This game is said to have the same problem, yet how could it be ported from console if it released simultaneously on PC?
Most games are designed for console first, even if they are released simultaneously, and most game engines are designed for console optimization.

Why publishers mismanage their teams and products (games) to this degree is a topic for another thread altogether.
I agree with you, the teams are mismanaged. Consoles will typically sell more games and/or at higher prices than their PC equivalent, so they take up more manpower for many of these developers. I believe that's the real issue: the console versions just take up so much manpower, especially as the launch window nears, and the developers are understaffed to make better profit margins.
 
Current gen consoles have 16 GB of RAM; most games today are built within console limitations, then ported to PC. That's why.
Seems most are replying here without reading the article and understanding this. :)

--

What I find funny is that if a game needs more VRAM, it's all the devs' fault.

But if a game needs DLSS to reach normal performance, it's the customer's fault for not upgrading.

My opinion is that yes, devs could do more to make things work better on PC, but Nvidia (the prime culprit here) also needs to stop these very open planned obsolescence practices with their VRAM capacity.
 
Seems most are replying here without reading the article and understanding this. :)
It's the third point I quoted in the initial post
 
We might ask why HD mods from communities are so large and require or max out current generation hardware. Hundreds or thousands of unique textures and HDR color for each.

VRAM is cheap, but PCB and die space are expensive. We could have a 128 GB card with 64-bit wide access and slow timings.
 
They really aren't using that much VRAM; just look at how much disk space a game nowadays demands. A recent patch for that Jedi game alone is 100 GB, so no wonder they use so much VRAM; there are lots of textures and lots of detail included in the scenes.
If you look at Portal 2, it takes about 11 GB of disk space, and back then the graphics card with the most VRAM had 3 GB, so roughly 1/3 of the size of the whole game, and you could use the highest settings.
If we translate that to today's games that use 100+ GB, it is normal to expect 1/3 to 1/4 of the game's size in VRAM to work with the highest settings. Simple as that. :-)
 
People got lazy during COVID, or they fell behind on their work, so they cut corners, just like PC part manufacturers; or the studios are putting too much pressure on the developers, etc...
 
Seems most are replying here without reading the article and understanding this. :)

--

What I find funny is that if a game needs more VRAM, it's all the devs' fault.

But if a game needs DLSS to reach normal performance, it's the customer's fault for not upgrading.

My opinion is that yes, devs could do more to make things work better on PC, but Nvidia (the prime culprit here) also needs to stop these very open planned obsolescence practices with their VRAM capacity.
What planned obsolescence? TLOU dropped memory usage by 20%+ while simultaneously making lower textures look better with just patches. Jedi boosted performance by up to 75% within a few days. 75%. Do you realize how huge that is? Yeah, for sure, let's blame Nvidia for lazy devs.

Funny thing is, Jedi drops to 680p and 17 fps on consoles as well. I'm sure that's because of Nvidia as well.
 
Even this overgrown VRAM chip has come out demanding "WE WANT MORE!! WE WANT MORE!!"

[Attached image: Capture.JPG]
 
I'm having issues with this bit:
Previously, there was no need to create highly detailed assets, using millions of polygons and ultra-high-resolution textures. Downscaling such objects to the point where they could be used on a PS4, for example, would defeat the purpose of making them that good in the first place. So everything was created at a scale that was absolutely fine in native form for PCs at that time.
That's not very accurate.
Plenty of modeling techniques generate high polycounts by default. Sculpting and SubD modeling come to mind. I vaguely recall seeing several pipelines that intentionally produce high-polycount models to do stuff like AO baking and normal mapping. It's relatively easy to downscale a model. Besides, models weren't just used for in-game rendering. You also had pre-rendered cinematics, marketing, etc.
High-poly models may not always be rigged and made usable in the engine, perhaps. But they are made.

On topic:
I'm not exactly sure what everyone is mad about.
If you want 2015 hardware utilization, you can dial back the settings to 2015 equivalents. Those who can perceive the difference in quality get their eye candy, those who can't get the choice for better performance. Win-win!
 
The number of PC gamers is gonna be pretty small if your thinking is correct... :)

trog
Actually, it's the opposite effect: PC gaming would then have a larger benefit over any other platform at a negligible price increase on hardware, while being easier to make games for, with better results. I do not think you thought this through.
 
If the rumoured PS5 Pro comes with even more VRAM this problem might get even worse.
This is not a problem, it is only perceived as such by people who buy into cards with too little VRAM.

Ergo anything Nvidia has produced since Ampere. The warnings were given, but they didn't ring true for a vast majority here and elsewhere. But honestly, what's in the OP isn't news, it's not rocket science, it was even explained at the PS5 reveal. And even if you're a pure PC guy you should've known better, because RT will take VRAM as well, on top of the game.

Customer. Due. Diligence.

It beats ignorance, and that includes the eternal whine about devs not optimizing muh games. This argument is as old as gaming, and it's true and false all at the same time, but totally irrelevant. You're doing this to play games after all, so they need to run properly.
 
This is not a problem, it is only perceived as such by people who buy into cards with too little VRAM.

Ergo anything Nvidia has produced since Ampere.
But that doesn't explain how patches dropped vram usage by 20+% while upgrading the image quality of textures (tlou). Or how in 5 days a patch boosted performance by up to 75% (jedi). I mean come on, no matter how much vram was available, devs can always get lazier and use up all of that regardless

Hogwarts day one was using over 21 GB of VRAM on my 4090. Yeah sure, if you have a 64 GB VRAM card then you wouldn't have an issue regardless, but is it really necessary to have all that VRAM?

It doesn't explain either how jedi renders at 648p and drops to 17 fps on a ps5. All that shared ram pool didn't seem to do any good either.
 
But that doesn't explain how patches dropped vram usage by 20+% while upgrading the image quality of textures (tlou). Or how in 5 days a patch boosted performance by up to 75% (jedi). I mean come on, no matter how much vram was available, devs can always get lazier and use up all of that regardless

Hogwarts day one was using over 21 GB of VRAM on my 4090. Yeah sure, if you have a 64 GB VRAM card then you wouldn't have an issue regardless, but is it really necessary to have all that VRAM?

It doesn't explain either how jedi renders at 648p and drops to 17 fps on a ps5. All that shared ram pool didn't seem to do any good either.
Your examples are triple A releases that are shit on launch.

This isn't a VRAM issue. It's a triple A issue. Corporate is the explanation; no need to look for further reason to that madness. It isn't new either, but since PC and PS5 share a code base, they now also share the day one cesspool.

None of this however relates in any way to the fact that EVEN with patches these new games need a lot of VRAM. Most definitely more than 8GB, and likely ever more often, more than 12GB for all bells & whistles.

Again, we need to stop dodging reality: requirements go up, and they do so because of console developments / the mainstream getting bumped.
 
Your examples are triple A releases that are shit on launch.

This isn't a VRAM issue. It's a triple A issue. Corporate is the explanation; no need to look for further reason to that madness. It isn't new either, but since PC and PS5 share a code base, they now also share the day one cesspool.

None of this however relates in any way to the fact that EVEN with patches these new games need a lot of VRAM. Most definitely more than 8GB, and likely ever more often, more than 12GB for all bells & whistles.

Again, we need to stop dodging reality: requirements go up, and they do so because of console developments / the mainstream getting bumped.
I'm happy for VRAM requirements to go up. Way up. I wouldn't mind textures needing 20+ GB of VRAM. My issue is - some, or actually a lot, of these games that use that much VRAM don't look better. Sometimes they don't even look nice. That's the main reason I'm very skeptical about the whole "we need more VRAM". For what? For worse looking games?

With that said jedi looks actually nice, but it's not one of the games that hog vram, it plays fine even on 8 gb cards.
 
I'm happy for VRAM requirements to go up. Way up. I wouldn't mind textures needing 20+ GB of VRAM. My issue is - some, or actually a lot, of these games that use that much VRAM don't look better. Sometimes they don't even look nice. That's the main reason I'm very skeptical about the whole "we need more VRAM". For what? For worse looking games?

With that said jedi looks actually nice, but it's not one of the games that hog vram, it plays fine even on 8 gb cards.
Well I think you're right but also that part of this is a twisted view on things.

It's like boiling a frog in hot water, right? We slowly get our increased fidelity, so we might not appreciate it as much; the wow effect isn't there. Some titles stand out, others don't: some stand out on their art design, some because of particle effects, yet others because they're great at showing you a huge open world with great view distance. Again, others stand out because they started combining these qualities, because all game design is in a way iterative, even between publishers/studios that compete. Everyone builds on the ideas of others. Today we get games that can combine qualities and still produce highly playable FPS - the Crysis days are definitely behind us, no studio wants to go there anymore, and frankly they don't have to.

We get many assets on screen and many more draw calls on the API, and still load the game up further with HD textures and mods. We have a Total War game that encompasses the campaign map of three games now including all of their factions. Etc. etc. We can play Riftbreaker with 200+ aliens on screen at north of 120 FPS. Games finally use all our CPU cores.

All of these new games are on a new level of fidelity; it's just the baseline that was elevated, but you still need to combine qualities of things we've learned from the past to make something stand out. And even then you're not catching as much attention as you used to get, because the baseline is elevated regardless. The games will always require talented art design and game design to really make them special. Those don't require a specific increase of VRAM. The new baseline, however, does - irrespective of how well a game is optimized.

I think a lot of people keep forgetting, and this is where corporate/big publishers drop the ball entirely, too, that you really dó need dedicated, motivated and talented teams to make truly good content. You need to give them time, resources, creative freedom, and a whole lot more to facilitate a truly great title.

I then look at studios like Larian (Divinity OS, Baldur's Gate 3) and I see something done the way it's meant to be done. You can just feel the love there if you watch any content they put out. Everything else looks, feels, secondary to making a great game; they're in their own little safe bubble doing their thing. That's the dedication you want to see, and those are the only games/studios where I have some semblance of faith left, to be honest. This is also why indie devs tend to produce mechanically much better games. They love what they make, and they want to explore the concept. And by combining great art design with solid mechanics and gameplay, you really do get something bigger than the sum of its parts; the game might even 'look better' just because it all clicks, not because it uses whatever high resolution of assets.
 
This is not a problem, it is only perceived as such by people who buy into cards with too little VRAM.
You really don't see a problem with games requiring increasingly more VRAM while looking no better than games 3-5 years old?

I mean, I don't suggest we should sit on 8GB forever, but why demand 16GB just to do the job that was done by 8GB a few years ago?
 
You really don't see a problem with games requiring increasingly more VRAM while looking no better than games 3-5 years old?

I mean, I don't suggest we should sit on 8GB forever, but why demand 16GB just to do the job that was done by 8GB a few years ago?
No, I think it's a fact that games are just going to get bigger and demand more, mostly due to new pipelines/engines etc. It becomes a problem when games are no longer fun to play / there are no killer apps to sell new hardware, etc. - that is why VR and even RT just won't go places, for example. And quite honestly, I think the recent releases are kind of in that problematic corner. So in that sense I agree, but in a general sense, I absolutely don't.

Again: the mainstream is the mainstream, and the console mainstream VRAM capacity is 16GB. We can talk all day about what is shared and what is different about the memory systems in consoles, but if you haven't got that 16GB bottom line on a new PC GPU of some calibre these days, you simply have a shit piece of hardware that offers less than the mainstream requirement. Games need to be optimized; at the same time, hardware needs to fit the general norm of what's expected, and you can't expect devs to keep burning money and hours on a segment of the market unwilling to move on. It's not how it works either. The majority market for gaming is consoles. Especially when you look at what's released that requires a bit of GPU beyond IGP/low budget, the overwhelming majority is console-first. PC gamers are simply going to have to get on that train or they're screwed. The market they represent per game is minor, or equal to consoles at best. This new paradigm has been creeping in since the PS4, and shouldn't be news anymore.

They can also buy a console and forfeit all the PC advantages like modding, freedom/no ecosystem with single gatekeeper, and freedom wrt input devices, additional software, hardware etc. And then five years later the market will look different and they might switch back. This happens all the time, I recognize it in my own gaming history as well. Whatever. The PC will always be the open garden and will always drive innovation, regardless of GPU spec or price.
 