
Are game requirements and VRAM usage a joke today?

I went pretty deep into it from release, because that's the best moment to be entering a new persistent economy... oh man. Good times, but the bugs... the exploits... the HACKS... The Dark Zone was still fun despite all of this; it was super vibrant in the early days.

I had a 980 Ti and just remember being underwhelmed even at 4K, and the boss-type enemies being bullet sponges with no real difficulty.

I did try it out again a couple of months after launch, once it had gotten some patches and I'd upgraded to dual 1080s, and I still couldn't get into it.

One of my faves, though I am trying to paint an objectively fair picture.

To be fair, though, this is how it was originally shown in 1996:

the-resident-evil-2-that-wasnt-1548897790.jpg

Gotta pour one out for Elza Walker lol....
 
I had a 980 Ti and just remember being underwhelmed even at 4K, and the boss-type enemies being bullet sponges with no real difficulty.

I did try it out again a couple of months after launch, once it had gotten some patches and I'd upgraded to dual 1080s, and I still couldn't get into it.
The Division was intended initially more as a PVPVE game. The Dark Zone represents this best. Did you play that? Because later in its dev cycle post release, the game nudged strongly to the PVE side of it, as the PVP economy was screwed from the beginning and balance wasn't great. And the irony of it... around the time they got to fleshing out different gear sets for specialization and rebalancing the whole thing, PVP in the open world was dead, but at that point it did actually work. The PVP early on was still awesome though. Voice chat in squads was positional - you could hear the chat from other teams if they got close to your team, depending on distance. The Dark Zone mechanics were fantastic too; attack another team and steal their hard earned loot while they wait for the heli to secure it... Chases through the streets and in buildings. The Rogue mechanic that would mark aggressive teams on the map for everyone. Teams banding together or turning on each other. Of course that came with all of its own problems, but still. The idea. The way it did work when it worked. I'll never forget it.
 
The Division was intended initially more as a PVPVE game. The Dark Zone represents this best. Did you play that? Because later in its dev cycle post release, the game nudged strongly to the PVE side of it, as the PVP economy was screwed from the beginning and balance wasn't great. The PVP was awesome though. Voice chat in squads was positional - you could hear the chat from other teams if they got close to your team, depending on distance. The Dark Zone mechanics were fantastic too; attack another team and steal their hard earned loot while they wait for the heli to secure it... Chases through the streets and in buildings. Teams banding together or turning on each other. Of course that came with all of its own problems, but still. The idea. The way it did work when it worked. I'll never forget it.

I did, but there were a ton of people using hacks, and while it was probably the most interesting part of the game, I had zero faith in Ubisoft getting it working properly.
 
I did, but there were a ton of people using hacks, and while it was probably the most interesting part of the game, I had zero faith in Ubisoft getting it working properly.
They didn't :D
 
That's what I mean though. It is loaded with assets, so it is in use.
Loaded and in use are not the same thing, as my example indicates. When your VRAM is overflowing with assets in use, you feel it. When it's being loaded with assets for later use, you don't.
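
If anyone wants to see the gap for themselves, here's a minimal sketch (Windows 10+, DXGI 1.4) that reads back the per-process budget and current allocation through IDXGIAdapter3::QueryVideoMemoryInfo; that allocation figure is roughly what overlays report, and it says nothing about how much of it a frame actually touches:

```cpp
// Minimal sketch: per-process VRAM budget vs. current allocation (DXGI 1.4).
// Build on Windows 10+ and link dxgi.lib.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;   // first GPU

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // CurrentUsage = what this process has allocated in VRAM,
    // Budget = what the OS will let it keep resident. Neither tells you
    // how much of that memory the current frame actually samples from.
    printf("Budget: %llu MB\n", info.Budget / (1024ull * 1024ull));
    printf("Usage:  %llu MB\n", info.CurrentUsage / (1024ull * 1024ull));
    return 0;
}
```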
 
Loaded and in use are not the same thing, as my example indicates. When your VRAM is overflowing with assets in use, you feel it. When it's being loaded with assets for later use, you don't.

The other really hard thing is that some games are fine for hours and then start stuttering, while others never load in high-quality textures but otherwise perform fine.

VRAM isn't an exact science, though: when you run out, you run out; hopefully that's at least 3-4 years after you purchased your GPU. Even though I upgrade every generation, I still keep GPUs for 4-5 years, and I personally think they should at least be usable for 4 years without turning down texture quality. That hasn't always been the case.

My 680s lost steam about 2 years after release in some games, my 970s after 2 as well, while both my Titan Xp and 2080 Ti never faced such issues over the four years I kept them. Now I will see if my 12GB 3080 Ti lasts another generation; only time will tell. Guessing my 4090 will be fine, but who knows if it'll make it till the 6090 comes out lol, or maybe the 6070 by then. Not sure if I can keep affording nearly 2000 USD GPUs lol.

The 290X lasted a while, though; it's the last high-end-ish AMD GPU I purchased.
 
I might be wrong, but even if the texture resolutions are insanely high, with advanced shaders etc., at the end of the day the amount used these days might be due to very bad optimization, especially in console ports; none of them are good. Anyway, at some point we are going to hit a performance wall and the only improvements will come from optimization. I remember playing Doom 3 and Quake 1 with relief texture shaders and UHD texture packs 10+ years ago, and they ran just fine with no more than 2GB of VRAM. Anecdotal, but still.
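
To put some rough numbers on the texture side (back-of-the-envelope math on my part, not anything pulled from a specific engine): a single 4K x 4K texture with a full mip chain is nowhere near as heavy as people assume once block compression is involved, which is why I also lean towards blaming streaming pools and general sloppiness rather than the textures themselves.

```cpp
// Back-of-the-envelope texture memory math (illustration only).
#include <cstdio>

int main()
{
    const double mip_chain = 4.0 / 3.0;          // full mip chain adds ~33%
    const double texels    = 4096.0 * 4096.0;    // one 4K x 4K texture

    // Uncompressed RGBA8: 4 bytes per texel.
    double uncompressed_mb = texels * 4.0 * mip_chain / (1024.0 * 1024.0);

    // BC7 / BC3 block compression: 1 byte per texel.
    double bc7_mb = texels * 1.0 * mip_chain / (1024.0 * 1024.0);

    printf("4096x4096 RGBA8 + mips: ~%.0f MB\n", uncompressed_mb); // ~85 MB
    printf("4096x4096 BC7   + mips: ~%.0f MB\n", bc7_mb);          // ~21 MB
    return 0;
}
```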
 
Operating systems are also using more and more memory; even flavors of Linux are no longer lightweight.
 
I might be wrong, but even if the texture resolutions are insanely high, with advanced shaders etc., at the end of the day the amount used these days might be due to very bad optimization, especially in console ports; none of them are good. Anyway, at some point we are going to hit a performance wall and the only improvements will come from optimization. I remember playing Doom 3 and Quake 1 with relief texture shaders and UHD texture packs 10+ years ago, and they ran just fine with no more than 2GB of VRAM. Anecdotal, but still.

The issue we face now on PC is stutter due to how texture decompression works on the PS5/Xbox Series consoles, so even when VRAM allocation isn't bad, a lot of console ports stutter, especially any UE4/UE5 game. Having a 13900K/14900K/4090/7900XTX can minimize it a bit, but this has been a plague on PC ports all year.
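
The textbook mitigation, and this is just a generic sketch of the idea rather than how any particular port actually does it (the texture struct and decode step here are made up), is to keep the expensive decompression on a worker thread and only let the render thread upload a couple of finished textures per frame:

```cpp
// Sketch: decompress textures on a worker thread so the render thread
// never blocks on it. Texture struct and "decode" are placeholders.
#include <queue>
#include <mutex>
#include <thread>
#include <vector>
#include <string>

struct DecodedTexture { std::string name; std::vector<unsigned char> pixels; };

std::queue<DecodedTexture> g_ready;   // decoded, waiting for GPU upload
std::mutex g_ready_mutex;

// Worker: runs the expensive CPU decompression away from the render loop.
void decode_worker(std::vector<std::string> requests)
{
    for (auto& path : requests) {
        DecodedTexture tex;
        tex.name   = path;
        tex.pixels = std::vector<unsigned char>(4096 * 4096 * 4); // stand-in for a real decode
        std::lock_guard<std::mutex> lock(g_ready_mutex);
        g_ready.push(std::move(tex));
    }
}

// Render thread: per frame, upload at most a couple of finished textures
// so a burst of streaming never turns into a frametime spike.
void upload_ready_textures(int max_per_frame)
{
    std::lock_guard<std::mutex> lock(g_ready_mutex);
    while (max_per_frame-- > 0 && !g_ready.empty()) {
        // upload_to_gpu(g_ready.front());   // engine-specific, omitted here
        g_ready.pop();
    }
}

int main()
{
    std::thread worker(decode_worker,
                       std::vector<std::string>{"brick.dds", "road.dds"});
    for (int frame = 0; frame < 100; ++frame)
        upload_ready_textures(2);   // budget the upload work per frame
    worker.join();
    return 0;
}
```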
 
I might be wrong, but even if the texture resolutions are insanely high, with advanced shaders etc., at the end of the day the amount used these days might be due to very bad optimization, especially in console ports; none of them are good. Anyway, at some point we are going to hit a performance wall and the only improvements will come from optimization. I remember playing Doom 3 and Quake 1 with relief texture shaders and UHD texture packs 10+ years ago, and they ran just fine with no more than 2GB of VRAM. Anecdotal, but still.
That reminds me of mods for Doom 3 with ultra-low-res textures and no effects whatsoever. It looked like ass, but at least it ran on a GeForce 4. Or the mod called Oldblivion. Speaking of which, I remember how turning grass distance up in Oblivion tanked performance even on high-end cards. Or Half-Life 2, which despite running well on mid-tier hardware, also had official DirectX 7 support built in so you could play it on a brick.

Games having massive hardware requirements isn't a new thing - devs always want their games to be top notch. It's only natural. We just don't have mods like the above anymore.
 
The closest thing for me is DXVK in a Linux environment (no, not the Windows one, as that seemingly doesn't work in all games).


I don't use Arch BTW... ;):shadedshu::roll:

Nobara is my go-to, which is Fedora-based and performs better than in that video, with correct GPU stats.


Windows GOW on a Radeon never has such a flat frametime graph. The DX11 implementation here is terrible, but it also affects some GeForce cards; the forum is filled with both sides.
 
the only joke is the laughably low amount of VRAM on most modern GPUs.
I think the bigger joke is how poor some games' memory management is, or perhaps how much room for improvement there is.
 

Attachments: IMG_3867.png
I think the bigger joke is how poor some games' memory management is, or perhaps how much room for improvement there is.
TW3 is not that good looking and wasn't upon release either, but that's my opinion.
 
TW3 is not that good looking and wasn't upon release either, but that's my opinion.
Even if you think it's ugly, you surely agree that it's not 1-1.5 GB VRAM use ugly. ;)

Or if you look at some games that use 10+ GB, they don't look 10 times better.
 
Even if you think it's ugly, you surely agree that it's not 1-1.5 GB VRAM use ugly. ;)
The first game to cripple 1.5GB of VRAM was BF4, for me.



Both BF games were massive in MP and visually impressive for their time. Both fit within 2GB too and use very low-poly assets, but for good reason... the maps are just too big.

Game engines go through different changes depending on the creator's vision, and they aren't always met with a yes from consumers.
BF went downhill after BF1.


Speaking of TW3..

The style itself is a challenge for my taste: an overly saturated, unreal-looking world.
The game takes itself seriously and fans expect us to, but it looks almost like a 3D rendition of Dennis the Menace.



Grim Dawn released a year later. It needs about 512MB of VRAM (pre-graphics update).

 
Even if you think it's ugly, you surely agree that it's not 1-1.5 GB VRAM use ugly. ;)

Or if you look at some games that use 10+ GB, they don't look 10 times better.
Yeah, when I first played Witcher 3 I was pleasantly surprised by how low the VRAM reading was; I thought something was wrong with MSI Afterburner. Turns out it really uses a very small amount of VRAM and looks miles ahead of other games that eat like 3-4x that amount. I thought it would set the benchmark for other devs to follow, as it was very popular, but meh. IMO it's still one of the best games of recent times. Another was Titanfall 2: the SP campaign was surprisingly good, and it incorporated ideas from other games, akin to what Singularity did. Another good title.
 
Yeah, when I first played Witcher 3 I was pleasantly surprised by how low the VRAM reading was; I thought something was wrong with MSI Afterburner. Turns out it really uses a very small amount of VRAM and looks miles ahead of other games that eat like 3-4x that amount. I thought it would set the benchmark for other devs to follow, as it was very popular, but meh. IMO it's still one of the best games of recent times. Another was Titanfall 2: the SP campaign was surprisingly good, and it incorporated ideas from other games, akin to what Singularity did. Another good title.

It also looks much better texture-quality-wise with the Halk Hogan mod. I was using it long before CDPR decided to incorporate it into the next-gen version of the game, and even with it I think usage was around 4GB at 4K.

Not sure how much it will come across, but here is vanilla Witcher 3 with the mod. Haters be damned, I still love the look of this game, and also the next-gen version of it.

Screenshot (87).png

A game doesn't need to have realistic visuals to be appealing, though. I really like Tales of Arise's aesthetics, and it also has super low VRAM usage at 4K.

Screenshot (56).png
 
I remember when I got an HD 6850 back in the day and Skyrim looked so good at high settings with a stable 60fps. Not only that, Max Payne 3, a fairly demanding game for its time, ran fantastic on high settings while consuming only about 724MB at 1080p, and it had impressive graphics for its period. I also played GTA 5 on normal settings and it still looked decent, though I didn't get anywhere near 60fps; that was expected with a 5-year-old GPU at that time. For sure I can say games do look a lot better now, but not by as much as they are getting more GPU-intensive.
 
I am curious about the effect of the texture filtering quality setting in the NV control panel on VRAM usage. We know textures consume VRAM, and I have never understood what this setting actually does, so I have always left it on high quality, but I might start running tests, including visual quality testing, to see its impact on VRAM.

Also, for those downplaying texture quality: on a game I take part in modding, if all you do is load in texture mods, the game is completely transformed and wouldn't look out of place as a new game in 2023, yet it was released in 1998.

The driver developer also added real-time lighting. Unlike typical RT, however, this lighting doesn't consume a ton of GPU resources; it barely has any impact at all and doesn't require RT GPUs, but it enhances the game significantly (as the stock game has pretty much no lighting effects, not even pre-RT lighting).
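
For context on why that kind of lighting is basically free: classic dynamic lights are a dot product and a falloff term per pixel, with no rays traced anywhere. Roughly this, written as plain C++ purely for illustration (in practice it lives in a pixel shader):

```cpp
// Classic per-pixel point light: one dot product + distance falloff.
#include <cmath>
#include <algorithm>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3  sub(Vec3 a, Vec3 b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
static float length(Vec3 v)      { return std::sqrt(dot(v, v)); }

// Lambert diffuse with inverse-square falloff: a handful of multiplies
// per pixel, which is why it barely registers on any modern GPU.
float point_light_diffuse(Vec3 surface_pos, Vec3 surface_normal,
                          Vec3 light_pos, float light_intensity)
{
    Vec3  to_light = sub(light_pos, surface_pos);
    float dist     = length(to_light);
    Vec3  dir      = {to_light.x / dist, to_light.y / dist, to_light.z / dist};

    float n_dot_l  = std::max(0.0f, dot(surface_normal, dir));
    return light_intensity * n_dot_l / (dist * dist);
}
```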
 
The Witcher 3 and Skyrim prove that we live in the "diminishing returns" era of computer graphics. 1% better graphics come at a 100% higher performance cost these days.
 
The Witcher 3 and Skyrim prove that we live in the "diminishing returns" era of computer graphics. 1% better graphics come at a 100% higher performance cost these days.

Witcher for sure is a good example: the RT version of it can look fantastic and even stand up to much newer releases in a lot of aspects, but the performance cost is a 4090/4080 and a high-end CPU.

Definitely an argument can be made that the ends don't justify the means, but I am glad it exists, and honestly hardware just needs to get better at it; we need 4090 performance to hit like 300-400 USD max. I am really worried about stagnation at the lower end, though, especially after seeing the 7600/4060/4060 Ti. To get a card you really feel good about, you need to spend 500 with AMD and 550 with Nvidia, in my opinion of course, and that is just too much given the relative performance.

The biggest mistake PC gamers make mentally is to think that just because they can afford the high end, the low end doesn't affect them. Those cards, with 3060-ish performance, are what developers are going to target, so if there is stagnation, that is just another 2 years of targeting really weak hardware. Sure, the consoles play a big part in that as well, and the two stronger ones offer around 6700XT-like performance, but PC still matters at the lower end.
 
The Witcher 3 and Skyrim prove that we live in the "diminishing returns" era of computer graphics. 1% better graphics come at a 100% higher performance cost these days.

True, though I also think some of it is the engine and modern dev practices.

I expect that if devs were tasked today with remaking games like Skyrim and Oblivion on the latest engine, with the same visuals as the older games, it would require extra resources to do the same thing.
 
[attached image: official system requirements chart]


Classic Remedy game. 100% gpu bound at any settings and resolution.
You can use a potato for a cpu, if you want.
 
View attachment 318351

Classic Remedy game. 100% gpu bound at any settings and resolution.
You can use a potato for a cpu, if you want.
I already miss the days when system requirements quoted actual rendered resolutions, not some upscaled mumbo jumbo. Like this:

Raster
RTX2060 / RX6600 - 720p on low with 30 fps
RTX3060 / RX6600XT - 835p on medium with 30 fps
RTX3070 / RX6700XT - 1080p on medium with 30 fps, or 540p with 60 fps
RTX4070 / RX7800XT - 1080p on high with 60 fps

RT/PT
RTX3070 / RX6800XT - 720p on medium + RT low with 30 fps
RTX4070 - 720p on medium + RT medium + PT with 60 fps
RTX4080 - 1080p on high + RT high + PT with 60 fps
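
For anyone who wants to translate "upscaled 4K" claims back into real render resolutions, it's just the per-axis scale factor of the quality mode; the factors below are the commonly published DLSS/FSR2 ones, so treat the output as ballpark numbers:

```cpp
// Converting an output resolution + upscaler mode back to the actual
// render resolution, using the commonly published per-axis scale factors.
#include <cstdio>

struct Mode { const char* name; double scale; };

int main()
{
    const Mode modes[] = {
        {"Quality",          0.667},
        {"Balanced",         0.580},
        {"Performance",      0.500},
        {"UltraPerformance", 0.333},
    };

    const int out_w = 3840, out_h = 2160;   // "4K" on the box

    for (const Mode& m : modes)
        printf("%-16s -> %4d x %4d internal\n",
               m.name, (int)(out_w * m.scale), (int)(out_h * m.scale));
    // Quality ~2560x1440, Performance = 1920x1080, etc.
    return 0;
}
```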
 
View attachment 318351

Classic Remedy game. 100% gpu bound at any settings and resolution.
You can use a potato for a cpu, if you want.
There must be a typo there - 1080p medium requires a 3070 or 6700 XT, but for 1440p, a 3060 or 6600 XT is enough?

Witcher for sure is a good example: the RT version of it can look fantastic and even stand up to much newer releases in a lot of aspects, but the performance cost is a 4090/4080 and a high-end CPU.
That brings up the question: is the RT version so much better that it justifies needing a 4080, whereas the vanilla version runs on a decade-old 960?

True, though I also think some of it is the engine and modern dev practices.

I expect that if devs were tasked today with remaking games like Skyrim and Oblivion on the latest engine, with the same visuals as the older games, it would require extra resources to do the same thing.
Are you saying that older engines are better in this regard?

I am curious about the effect of the texture filtering quality setting in the NV control panel on VRAM usage. We know textures consume VRAM, and I have never understood what this setting actually does, so I have always left it on high quality, but I might start running tests, including visual quality testing, to see its impact on VRAM.
I think all that toggle does is change the level of anisotropy, which has minimal impact on performance (someone correct me if I'm wrong).
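
For what it's worth, anisotropic filtering itself is a sampler setting rather than extra texture data, so whatever that control-panel toggle actually maps to, I wouldn't expect it to move VRAM usage at all. A minimal D3D11-style sketch of where the knob lives:

```cpp
// Anisotropic filtering is configured per sampler, not per texture:
// it changes how many samples are taken, not how much memory the
// texture occupies. D3D11 shown; other APIs have the same concept.
#include <d3d11.h>

D3D11_SAMPLER_DESC make_af_sampler(UINT max_anisotropy /* 1..16 */)
{
    D3D11_SAMPLER_DESC desc = {};
    desc.Filter         = D3D11_FILTER_ANISOTROPIC;   // vs. LINEAR/POINT
    desc.AddressU       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressV       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressW       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.MaxAnisotropy  = max_anisotropy;             // 16x is the usual maximum
    desc.ComparisonFunc = D3D11_COMPARISON_NEVER;
    desc.MinLOD         = 0.0f;
    desc.MaxLOD         = D3D11_FLOAT32_MAX;
    return desc;
    // then: device->CreateSamplerState(&desc, &sampler); and bind as usual
}
```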

Also, for those downplaying texture quality: on a game I take part in modding, if all you do is load in texture mods, the game is completely transformed and wouldn't look out of place as a new game in 2023, yet it was released in 1998.
That's why I think that texture quality and geometry are what give the most detail, not lights. Focusing all resources on RT while textures (especially on skin) still look like it's 2015 is the wrong direction, imo.
 