
Why Are Modern PC Games Using So Much VRAM?

Not to diminish the depth at which this topic can be discussed, but I may be getting old, because I think all this is a non-issue that just happened to be trendy to talk about for a few weeks. Trends come and go, and this one probably came up due to a lack of more noteworthy news in gaming. Meanwhile I'm just happy more games are being released, and not just AAA but many indie games as well, even 2D. And that my mid-range GPU is way more than enough for all of my gaming needs. Especially since I can always lower settings to medium-high if it ever comes to that. Or I might upgrade my GPU in the future, idk. So yeah, things are good. More indie and AAA games are released these days than ever before, game engines are plentiful and more accessible than before, the internet is fast, and you can download any game in the world and it all works on one computer. I mean, it's a great time to game. Not sure why I responded here, I guess due to a lack of a diary. The personal conclusion of this diary-like post is that I should pay more attention to news about other, more interesting and important things, and less to gaming news. Thank you for bearing with my post, and have a wonderful day whichever side of the discussion you hold, if any! <3
 
@dirtyferret

Why Are Modern PC Games Using So Much VRAM?

To be honest, for the last six years I've been more curious as to why modern games weren't pushing eye candy and realism (and with them VRAM utilisation) far enough to unavoidably muster support for higher VRAM provisions. Likewise, it's annoying we haven't seen DirectStorage take off. Your linked article does a great job of summarising at length some of the concerns I've had over the years. Personally, I've always considered VRAM one of the primary offenders for slower progress in graphics fidelity. Developers were never going to push on this sort of thing while the broader consumer base is limited to ~8GB.

The problem with these discussions is timing. AMD sniping at Nvidia's lack of provisioning, with a handful of titles now exceeding the 8GB limit, was always going to invite unpaid, aimless brand advocates to hijack these discussions. It doesn't matter how these discussions come to light; it's a 101% valid one. The long-awaited tipping of the scales in favour of increased VRAM'ification is finally hitting home, although both Nvidia and AMD are hardly helping at the lower and more important end of the GPU stack. Actually, since AMD's rant, I'd be gobsmacked if the 7600 XT is actually what it seems, an 8GB card.

I'm just wondering what everyone's opinions are on whether VRAM limitations have played a part in the slow expansion of the much desired richer/realer graphics possibilities?
 
Actually, since AMD's rant, I'd be gobsmacked if the 7600 XT is actually what it seems, an 8GB card.
I'm OK with the 7600 XT having 8GB of RAM. In fact, I think the entry-level GPU market needs more cards and is underserved. Not everyone wants to play the latest and greatest PC games or drop $400-plus just to play online games with their friends. At its leaked price of $299, is it a bit steep? 100%. Is it a bargain compared to what Nvidia is offering? Most likely.

I'm just wondering what everyone's opinions are on whether VRAM limitations have played a part in the slow expansion of the much desired richer/realer graphics possibilities?
This really is a qualitative question, because your opinion of "richer" may not match mine or other people's, and vice versa. What the article stated was that the RAM limitations of the previous consoles did limit what could be shown graphically, and with developers solely focused on the current generation of consoles, we are now seeing the need for more VRAM in gaming cards in order to achieve smooth FPS performance. Now, if seeing more detail in clothing creates a richer graphics experience, then great. I would personally prefer richer stories and more mentally rewarding gameplay than "dodge and strike", but that's me.
 
@dirtyferret

I'm just wondering what everyone's opinions are on whether VRAM limitations have played a part in the slow expansion of the much desired richer/realer graphics possibilities?

I'm not all that concerned with advancements in graphics at this point. Gains to the untrained eye are perceptually incremental. Certain top-flight games from as long as ten years ago still look great today. I'd prefer development resources be spent on design, gameplay, art direction and playability; visuals are now in the realm of diminishing returns re: effort v. reward.
 
Sorry about the confusion.

"game devs" or "game developers" don't just mean the people involved, just as "game publishers" don't mean any individual, but the companies, i think. In Redfall the game developer is Arkane (not some code monkey) and the Publisher is Bethesda, it's on the steam page for the game.

That's a huge distinction to make. Businesses will decide to take a certain tack when needs arise. The tone gets set from the top, and if a dev shop pushed something out too fast knowing full well they were doing it, that's a disgrace to the industry. Most of the time a balance needs to be struck between doing things right and doing them quickly, because nobody is going to wait 15 years for the absolutely perfect game to be made, and absolutely no investor is going to take on that kind of risk.

What you're describing is business practices more than anything else, and a tone being set by executive management. If I were in that kind of organization, I'd probably get fired for speaking up, which I do quite often when I think something is a bad idea; I'm fortunate to work with leaders who are receptive to that sort of pushback when the need arises. However, a lot of shops aren't like that, which is bad.
 
If I were in that kind of organization, I'd probably get fired for speaking up, which I do quite often when I think something is a bad idea; I'm fortunate to work with leaders who are receptive to that sort of pushback when the need arises. However, a lot of shops aren't like that, which is bad.

Oh boy, do I know the feeling. I got in so much trouble over the years for speaking my mind when things made no sense. I just can't be a corporate robot. Now I work for myself, and I'm better for it.

Absolutely.

You are a consumer but are you the customer in that scenario I just posted?
me: consumer

Customer as in whoever is the customer of the job (not me as a consumer): Bethesda, your boss, your boss's boss, the contractor of the subcontractor, whoever. I guess, like you, I think in one language and write as well as I can in another, so things may be lost in translation. I'm also a bit dyslexic, which doesn't help.
 
I'm not all that concerned with advancements in graphics at this point. Gains to the untrained eye are perceptually incremental. Certain top-flight games from as long as ten years ago still look great today. I'd prefer development resources be spent on design, gameplay, art direction and playability; visuals are now in the realm of diminishing returns re: effort v. reward.

but....

...."opinions on whether VRAM limitations have played a part in the slow expansion of the much desired richer/realer graphics possibilities?"
 
"For example, in version 1.03 of The Last of Us, we saw average VRAM figures of 12.1 GB, with a peak of 14.9 GB at 4K Ultra settings. With the v1.04 patch, those values dropped to 11.8 and 12.4 GB, respectively. This was achieved by improving how the game streams textures and while there's possibly more optimization that can be done, the improvement highlights how much difference a little fine-tuning on a PC port can make."

As I mentioned in the other VRAM-hungry topic - and it's something I've noticed time after time - optimization tends to have a huge say, especially when it comes to a console-to-PC port.
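
Quick arithmetic on those quoted v1.03 vs. v1.04 figures, just to show the scale of what one patch bought (my own back-of-the-envelope, not from the article):

```python
# VRAM figures quoted above for The Last of Us at 4K Ultra (GB).
avg_before, avg_after = 12.1, 11.8    # average use, v1.03 -> v1.04
peak_before, peak_after = 14.9, 12.4  # peak use, v1.03 -> v1.04

print(f"average: -{(1 - avg_after / avg_before) * 100:.1f}%")    # ~2.5%
print(f"peak:    -{(1 - peak_after / peak_before) * 100:.1f}%")  # ~16.8%

# The peak drops far more than the average: better texture streaming
# mostly shaves the worst-case spikes, and it's those spikes that push
# a card past its VRAM limit in the first place.
```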

Anyway, Ultra High settings are usually meant for high-end GPUs. Sure, it would be nice (or ideal - a gaming utopia if you will) for mid-range GPUs to be packed with 16+ GB of VRAM so even the mid class could get a taste of ultra settings at 30+ FPS - but, unfortunately, even hardware developers/manufacturers (of GPUs in this case) have a thing for a monopolized hierarchy. The only thing that usually works against corporate monopoly is competition. Maybe if 3DFX and six others were still in the game versus AMD, Nvidia and Intel, the equivalent of a 4090 with 40 GB of VRAM would be a mid-range product by now. Yet that's pure fantasy, far from the world we live in, where there are only two main players, while the third (Intel) joined the upper-class game only recently and needs to level up. That being said, I get it... the peasant life is also about complaining and begging for higher VRAM crumbs, since the bourgeois got greedy again and raised the taxes (charging more for less). Looking back, with the price of a cow (GTX 1xxx) you can barely buy a pig (RTX 4xxx) these days... a skinny one too (they didn't even have the decency to feed it enough VRAM for what they're charging).
 
but....

...."opinions on whether VRAM limitations have played a part in the slow expansion of the much desired richer/realer graphics possibilities?"

Not really. I feel like software is a bigger limiting factor. It's easy to throw memory at a problem; how many PCs have twice as much RAM as they should realistically need because of poor memory management on the application side? You can have all the VRAM in the world, but if scheduling is off your game can still run like crap. Was the recent discussion about the limitations of the Creation engine family in this thread? We're probably looking at similar problems right now: The hardware can potentially do what we're asking of it, but we're not making the requests in quite the right way.
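
To make the "throwing memory at it" point concrete, here's a toy sketch of the budgeted LRU texture cache idea a streaming system revolves around. Every name and size below is made up for illustration; no actual engine works exactly like this:

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU texture cache with a fixed VRAM budget (illustrative only)."""

    def __init__(self, budget_bytes: int):
        self.budget = budget_bytes
        self.used = 0
        self.cache = OrderedDict()  # texture_id -> size in bytes, in LRU order

    def request(self, texture_id: str, size_bytes: int) -> None:
        if texture_id in self.cache:
            self.cache.move_to_end(texture_id)  # already resident: mark as hot
            return
        # Evict least-recently-used textures until the new one fits.
        while self.used + size_bytes > self.budget and self.cache:
            old_id, old_size = self.cache.popitem(last=False)
            self.used -= old_size
            print(f"evict {old_id} ({old_size / 2**30:.1f} GiB)")
        self.cache[texture_id] = size_bytes
        self.used += size_bytes

# Pretend budget of 8 GiB and chunky 3 GiB texture sets: the cache churns,
# which shows up in-game as textures popping in and out. A bigger budget
# doesn't fix bad scheduling; it just makes the churn less visible.
cache = TextureCache(budget_bytes=8 * 2**30)
for tex in ["terrain", "hero_coat", "skybox", "npc_atlas", "hero_coat"]:
    cache.request(tex, 3 * 2**30)
```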
 
It seems to me the mod scene for PC gaming has been dying for some time -- which is why it was shocking that Metro Exodus came out with a map editor supported by the developer.
 
It seems to me the mod scene for PC gaming has been dying for some time -- which is why it was shocking that Metro Exodus came out with a map editor supported by the developer.

If it's dying, it's from a lack of innovation in games, not from gamers or from the whole ecosystem of gaming being x86-64.

We keep getting the same on-rails shooters, racing games, and RTS games.

We need immersive games, out-of-our-reality games, but we aren't far enough into the cultural cycle to allow it yet. Maybe in 10-15 years we will have games we can laugh at and enjoy again.

As far as system specs go, going from 640x480 to 800x600, to 1024x768, to 1080p was huge. But 4K HDR is not a linear jump, it's exponential, and 8K or 16K is the same.
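
For rough numbers (my own, assuming the standard resolutions meant above):

```python
# Pixel counts per frame for common resolutions, relative to 1080p.
# Framebuffer cost also scales with bit depth, HDR and AA, so this is
# only the crudest lower bound on how VRAM demand grows.
resolutions = {
    "640x480": (640, 480),
    "800x600": (800, 600),
    "1024x768": (1024, 768),
    "1920x1080": (1920, 1080),
    "3840x2160 (4K)": (3840, 2160),
    "7680x4320 (8K)": (7680, 4320),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:>16}: {px:>10,} px  ({px / base:.2f}x 1080p)")

# 4K is 4x the pixels of 1080p and 8K is 16x: each step multiplies
# rather than adds, which is the "exponential" jump described above.
```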
 
Not really. I feel like software is a bigger limiting factor. It's easy to throw memory at a problem; how many PCs have twice as much RAM as they should realistically need because of poor memory management on the application side? You can have all the VRAM in the world, but if scheduling is off your game can still run like crap. Was the recent discussion about the limitations of the Creation engine family in this thread? We're probably looking at similar problems right now: The hardware can potentially do what we're asking of it, but we're not making the requests in quite the right way.

There's not an ounce of disagreement here, only an inkling at the back of my head suggesting VRAM limitations are a real problem, and one that is difficult to deny now that we're already seeing games spilling over 8GB (not to even mention RT/PT/etc.). I totally agree it's not going to impact everyone's expectations or leanings... but I'm trying to wrap my head around the borderline limitations, and why higher capacities aren't easily provisioned to open doors for even greater possibilities (actually, not just possibilities, but easing up on some of the over-optimization riff-raff which dilutes quality consistency or trims assets altogether). At the rate these card manufacturers are dropping price bombs, is it really going to bankrupt the gaming community to pay a small premium to grow into 12GB, or at their respective resolutions 16/24/etc.? Or are GPU manufacturers calculatedly managing our affairs for us, with a road map to repeat sales more promptly?

I just can't help wondering whether developer optimisation effort could be eased off a bit, away from packing more into less, with a larger palette to begin with. I also wonder if inadequate VRAM offerings have been, and continue to be, a burden that runs developers into less-than-desirable, strenuously enduring optimisations which now seem beyond perfecting with current, or more likely forthcoming, demands. Nowadays, in some instances without even maxing out on VRAM, horribly drafted real-time dynamic asset substitutions emerge. For some this won't be a problem, depending on the game type/setting or even resolution, but for the eye-candy enthusiast the problems have been noticeable for some time: textures shifting to and fro, distance scaling waning, etc... in short, less than consistent visual fidelity for those who desire it. Optimisations nowadays smell more like compromises, or you're compelled to spend bigger for the better stuff (or is that the plan all along?).
 
"opinions on whether VRAM limitations have played a part in the slow expansion of the much desired richer/realer graphics possibilities
absolutely it has, why develop something that hardware really can't use
 
OP, you could sidestep this problem by getting older games & modding them to your liking.
For the last six months I've done nothing in gaming but get into Oblivion GOTY edition with mods & near-whole-game overhauls via mod megapacks. A lot of newer games are just rehashes of plots & storylines from older games anyway, so you won't miss much in that department. The golden years for innovative PC gaming were decades ago (1990-2010). Not only that, but the modding community has matured a lot of the mods that came out when those older games were still new, so the incidence of bugs with those mods is now very limited indeed - another bonus for the end player! :)
 
Well, with Nvidia fighting to keep VRAM low, they are developing something called Neural Texture Compression to reduce texture sizes.


Actually, these are the types of advances I'm absolutely delighted to hear about, but certainly not at the cost of deliberate VRAM limitations or higher premiums strapped onto features. The more efficient and less memory-starved real-time texture compression techniques, the better! Path tracing... neural compression advances... AI artistry... the near future > feast for me eyes!!!
 
Yet I still remember back in 2005 when people said that 256MB was overkill... how far we've come.
 
Have a look at this, 4 times higher resolution with 30% less memory usage. Neural Texture Compression


30% less usage means your 8GB of VRAM effectively stretches to about 11.4GB, and your 12GB to about 17GB.
And 4 times means your 1080p becomes 4K :peace:

[Image: book-ntc-vs-bc.png]


And they have NeuralVDB, which uses 100 times less memory to render smoke, fire, water, clouds...
At timestamp 0:48 you can see a 14.9GB cloud simulation reduced to 666 megabytes:


[Image: rdr2-coat-340x400.jpg]
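
Rough math behind those claims, using only the figures quoted in this post (my own back-of-the-envelope, not from Nvidia's material):

```python
# If assets shrink by `savings`, the same VRAM holds 1/(1 - savings)
# times as much of them (back-of-the-envelope; real budgets also hold
# framebuffers, geometry, etc., which this ignores).
def effective_capacity_gb(vram_gb: float, savings: float) -> float:
    return vram_gb / (1.0 - savings)

for vram in (8, 12, 16):
    eff = effective_capacity_gb(vram, 0.30)
    print(f"{vram} GB at 30% texture savings ~ {eff:.1f} GB effective")

# NeuralVDB cloud example from the linked video: 14.9 GB -> 666 MB.
print(f"cloud sim: {14.9 * 1024 / 666:.0f}x smaller")  # ~23x for that scene
```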
 
Actually, these are the types of advances I'm absolutely delighted to hear about, but certainly not at the cost of deliberate VRAM limitations or higher premiums strapped onto features. The more efficient and less memory-starved real-time texture compression techniques, the better! Path tracing... neural compression advances... AI artistry... the near future > feast for me eyes!!!
The issue is that, as usual, it looks like it will be proprietary; it uses their tensor cores, so of course it couldn't be deployed as a generic solution.
 
Have a look at this, 4 times higher resolution with 30% less memory usage. Neural Texture Compression
Nice. By the time Ada and Ampere have turned obsolete, we might see this in a game.
 
The issue is that, as usual, it looks like it will be proprietary; it uses their tensor cores, so of course it couldn't be deployed as a generic solution.
It can't be proprietary if they want it to be adopted. Game devs are not going to ship two versions of a game with two wildly different texture budgets AND texture decompression costs, which seem to be about 3x that of BC7 on the hardware they tested.
Pretty cool tech nonetheless.
 
Nice. By the time Ada and Ampere have turned obsolete, we might see this in a game.

I wouldn't be surprised if Nvidia locks previous-generation GPUs out of using it, like it did with DLSS 3.
 
I wouldn't be surprised if Nvidia locks previous-generation GPUs out of using it, like it did with DLSS 3.
This. Their business strategy is just so greedy that I can't personally allow it any more. Going with Radeon from now on.
 
This. Their business strategy is just so greedy that I can't personally allow it any more. Going with Radeon from now on.

Both camps are greedy; it's the nature of business. Nvidia is greedier than AMD, true, but hey, those designer black leather jackets Huang wears aren't cheap. :D
 