
Games that require more than 24 GB VRAM...


conundrum reminds me of this
(gif)
 
You can't even reliably see a difference between 4K and 8K.
And by the time you could, you'd need a room so big that only people who are really well off and living comfy lives could experience it, since the majority don't have rooms large enough to house a panel of the size needed to make a visual difference.

The issue is that screens rely on the viewer being able to see and discern all parts of the image, so sitting closer isn't an option unless you want the majority of the screen sitting in your peripheral vision.
 
ray tracing enabled of course

(gif)
 
You can't even reliably see a difference between 4K and 8K.
And by the time you could, you'd need a room so big that only people who are really well off and living comfy lives could experience it, since the majority don't have rooms large enough to house a panel of the size needed to make a visual difference.
I agree completely. It gets better though because the only thing I can think of that would be large enough to be 8K would be a TV. I learned something about gaming on TVs years ago (because that's what I use for my PC display, a 55" 4K TV), something I never expected (but should have). My TV seems to have a hardware upscaler and this makes sense because a standard 480p signal would look terrible on a 55" display if it was just played native. Sure, there are HD TV channels but even those only tend to be 720p. I did an experiment with the Far Cry 5 benchmark with my RX 5700 XT. I ran it at 720p, 1080p, 1440p and 2160p. Each time I did, I had my face to the screen, looking for noticeable differences in things like foliage texture and (especially) water quality. I ran the gamut several times in different orders of resolution:

Run #1 - 720p - 1080p - 1440p - 2160p
Run #2 - 2160p - 720p - 1440p - 1080p
Run #3 - 1440p - 1080p - 720p - 2160p
Run #4 - 1080p - 2160p - 720p - 1440p

I did this to avoid my eyes just getting used to slightly higher resolutions each time. At the end of the test, I had to conclude that I couldn't see ANY difference whatsoever. I spoke to Jim from AdoredTV about it and he found it very surprising as well. I then told him about my suspicions that big-panel TVs might have upscalers to prevent SD signals from looking bad, and he seemed pretty sure that had to be it (or, as he said, "Either that or your eyes are out of bloody whack!"). I swear, the Scots have a great sense of humour! :laugh:
The issue is that screens rely on the viewer being able to see and discern all parts of the image, so sitting closer isn't an option unless you want the majority of the screen sitting in your peripheral vision.
Yeah, but as I said, if it's a TV, it might not matter how close to the screen you are because if it has a hardware upscaler, it might just end up looking the same. As for displays, yeah, it would have to be pretty huge, like the size of a non-stadium movie screen. I know from my own experience that you'd be pretty hard-pressed to tell 720p from 1080p on a 15.6" craptop. For this reason, even though my craptops have a maximum resolution of 1080p, I just run them at 720p to save power and heat because I can't tell a difference one way or the other.

The dumbest thing that I've ever seen is the Sony Xperia 4K phones. Actually, that's not true because at least it seems to be selling (although I'd hate to see the IQ test results of the noobs that actually pay over $1,100 for these things). The dumbest thing that I've ever seen is someone actually complaining on Reddit that nobody else has bothered to release a 4K smartphone:
Why is Sony still the only phone maker creating smartphones with 4K displays 8 years later?

It's like it never occurred to this noob that 4K is completely useless on a 5.5" display. :kookoo:
 
What thread is this? 24 GB isn't even needed by any game at 12K. Tested it.

8 GB is enough for 98% of the games on the market at 4K. 12 GB is enough for 99%, and 16 GB covers everything, including 8K.
 
What thread is this? 24 GB isn't even needed by any game at 12K. Tested it.

8 GB is enough for 98% of the games on the market at 4K. 12 GB is enough for 99%, and 16 GB covers everything, including 8K.

8 GB runs into limitations in many games at 1440p. 12 GB is the sweet spot there, but realistically you want a 16 GB GPU for smooth 4K gaming. 24 GB is great for some extra bells and whistles, but not necessary for gaming right now. Remember that 4K is 400% of 1080p's pixel count, while 1440p is only 178% of it. It's a far bigger leap than most realize.
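For anyone who wants to sanity-check those percentages, it's plain pixel-count arithmetic; a minimal sketch, assuming the standard 16:9 resolutions:

```cpp
// Pixel-count arithmetic behind the 400% / 178% figures above.
#include <cstdio>

int main()
{
    struct Res { const char* name; int w, h; };
    const Res base = { "1080p", 1920, 1080 };
    const Res targets[] = { { "1440p", 2560, 1440 }, { "4K", 3840, 2160 } };

    const double basePixels = double(base.w) * base.h;   // 2,073,600 pixels
    for (const Res& r : targets)
    {
        const double pixels = double(r.w) * r.h;
        std::printf("%s: %.1f MP, %.0f%% of 1080p\n",
                    r.name, pixels / 1e6, 100.0 * pixels / basePixels);
    }
    // Prints roughly: "1440p: 3.7 MP, 178% of 1080p" and "4K: 8.3 MP, 400% of 1080p".
    return 0;
}
```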

I did an experiment with the Far Cry 5 benchmark with my RX 5700 XT. I ran it at 720p, 1080p, 1440p and 2160p. [...] At the end of the test, I had to conclude that I couldn't see ANY difference whatsoever.

Take it from me: my eyesight is bad, and I rely on glasses to the point that I can't function without them... Your eyesight is bad and you need to get that looked at.

You should definitely be able to tell a 720p or 1080p signal apart from native 4K on a 55-inch 4K TV. It looks completely different! 1440p looks especially hideous because it doesn't integer-scale onto 4K (it's a 1.5x ratio), unlike 1080p (exactly 2x) and 720p (exactly 3x).

Upscalers are just there to smooth out the image. And they're not perfect even in high-end sets like my LG G3 OLED.
 
If you consider 5 games out of tens of thousands to be "many", then sure.

The point is that there already are games that do. The tendency over time is for more games to need >8 GB cards, not fewer.
 
Of course, but I never endorsed 8 GB. I only said what is true. I would not recommend 8 GB for anyone unless you are only playing old games and/or esports titles; otherwise 12 GB at least! 16 GB if you plan on 4-7 years of use from the same GPU, which most people in a forum like this don't do anyway. But the VRAM panic is bullshit, especially since it's always in the context of maxed settings, and at maxed settings the GPU power runs out before VRAM becomes a bottleneck anyway. And on PC we have the luxury of adjusting settings, because most settings also lower VRAM usage. Also, in most modern games you won't even see a difference if you lower textures from ultra to high, yet it saves up to 2 GB of VRAM. And FSR and DLSS also save VRAM, since they use a lower render resolution.
 
After extended gameplay, Star Wars Jedi will report that it is using 19 GB of VRAM, but I think it is just storing data there because it can. My system memory usage gradually goes down the longer I play, which is why I think the game is caching data in VRAM instead of system memory.
 
22 GB is the most I've seen so far; that was Hogwarts Legacy. Cyberpunk 2077: Phantom Liberty was around 18 GB.
 
After extended gameplay, Star Wars Jedi will report that it is using 19 GB of VRAM.
The more VRAM you have, the more the game will use, just because it can, not because it needs it. Avatar fills up 24 GB after a while, yet it's perfectly fine with 8 GB at 4K, because the engine is smart as hell.

People also seem to forget that the 8 GB on a GTX 1080 fills up faster than the 8 GB on an RTX 3070. No excuse for that BS on the 3070, but it's true. Let's see what happens when they ramp up cache massively, not just 40 MB, or the 128 MB Infinity Cache on RDNA 2. Nvidia also has stuff in the pipeline that will make VRAM completely irrelevant for gaming, but no one knows when it will be done. AI will change everything in gaming too. Expect frame generation in ten years that turns 1 frame into 10 frames without you noticing anything. That's the future.
 
@ojoqrom
Except you're forgetting pixel size. Having worked in a couple of stores selling TVs (shop-in-shop), I used my larger screens not just for demo videos but for gaming, to show people what they could expect when running them for more than movies/TV.

Looking at the 85-inch (4K) set I had, you need to be a minimum of 6 ft away, not because it's large, but so that the screen-door effect isn't visible. With the same-sized TV running 8K, you could be 3-6 ft away and it would be much more immersive (which is why we go to theaters with their larger screens), without any impact on image quality from low PPI.

And that's ignoring that I don't have to run everything at native res, as the panel would still be 8K, the same way I'm fine playing QHD on my 50-inch (4K) at 2-3 ft with newer games that don't hold a constant FPS, even though I have VRR.

@Avro Arrow
Sure, there will be people buying stuff just because they can / want to show it off, but the 4K Xperias I looked at run that resolution so the phone can be used as a "mobile" screen able to do "native" UHD. I mean, what normal user would need an HDMI port on their phone? They also offer a 32-inch OLED for 30K+ (last time I looked at them), which isn't intended for anyone outside of shooting/grading/editing production work; that doesn't mean they released it so someone can boast about owning it on YT...
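The "minimum of 6 ft for an 85-inch 4K set" figure above can be roughly sanity-checked with the common ~1-arcminute-per-pixel visual-acuity rule of thumb. A small sketch under that assumption (it's only an approximation; subpixel layout, contrast and individual eyesight all shift the real threshold):

```cpp
// Rough estimate of the distance at which individual pixels of a 16:9 panel
// stop being resolvable, using the ~1 arcminute-per-pixel rule of thumb.
// This is an approximation, not a hard perceptual threshold.
#include <cmath>
#include <cstdio>

int main()
{
    const double pi = 3.14159265358979323846;
    const double arcminRad = (1.0 / 60.0) * pi / 180.0;   // 1 arcminute in radians

    struct Panel { const char* name; double diagonalIn; int horizontalPixels; };
    const Panel panels[] = {
        { "85\" 4K", 85.0, 3840 },
        { "85\" 8K", 85.0, 7680 },
        { "55\" 4K", 55.0, 3840 },
    };

    for (const Panel& p : panels)
    {
        // Width of a 16:9 panel from its diagonal: w = d * 16 / sqrt(16^2 + 9^2).
        const double widthIn = p.diagonalIn * 16.0 / std::sqrt(16.0 * 16.0 + 9.0 * 9.0);
        const double pitchIn = widthIn / p.horizontalPixels;   // pixel pitch in inches
        const double distIn  = pitchIn / arcminRad;            // small-angle approximation
        std::printf("%-8s ~%3.0f PPI, pixels blend together at roughly %.1f ft\n",
                    p.name, 1.0 / pitchIn, distIn / 12.0);
    }
    // Roughly: 85" 4K ~52 PPI -> ~5.5 ft, 85" 8K ~104 PPI -> ~2.8 ft,
    // 55" 4K ~80 PPI -> ~3.6 ft, which lines up with the distances quoted above.
    return 0;
}
```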
 
Pixel density is pretty much the most important part. Most people waste 4K because they don't understand this.

I sit 2-2.4 m away from my 55-inch OLED when gaming. 2.4 m is already too much, but I often lie down when gaming, so fuck it.

22 GB is the most I've seen so far; that was Hogwarts Legacy. Cyberpunk 2077: Phantom Liberty was around 18 GB.
That means nothing. We actually can't see how much VRAM a game really needs; only dev tools can, and no, MSI Afterburner can't either. We only know it's too much when you get the obvious VRAM-related signs: low GPU usage, heavy FPS drops and spikes from hell, and in general much lower FPS than you would get normally.
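For what it's worth, even the counters the OS itself exposes only report what a process has committed, not what it needs. A minimal Windows-only sketch using DXGI's IDXGIAdapter3::QueryVideoMemoryInfo (assuming a DXGI 1.4-capable system) prints each adapter's VRAM budget and the calling process's current usage; note that "CurrentUsage" is still allocation, which is exactly the limitation described above:

```cpp
// Prints the OS-reported VRAM budget and the calling process's current VRAM
// commitment per adapter. "CurrentUsage" is allocation, not need -- it cannot
// tell you the minimum a game actually requires.
#include <windows.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        ComPtr<IDXGIAdapter3> adapter3;
        if (FAILED(adapter.As(&adapter3)))
            continue;

        DXGI_QUERY_VIDEO_MEMORY_INFO info{};
        if (SUCCEEDED(adapter3->QueryVideoMemoryInfo(
                0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
        {
            DXGI_ADAPTER_DESC1 desc{};
            adapter->GetDesc1(&desc);
            std::printf("%ls\n  budget: %llu MiB, this process's usage: %llu MiB\n",
                        desc.Description,
                        info.Budget / (1024ull * 1024ull),
                        info.CurrentUsage / (1024ull * 1024ull));
        }
    }
    return 0;
}
```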
 
That means nothing. We actually can't see how much VRAM a game really needs; only dev tools can, and no, MSI Afterburner can't either. We only know it's too much when you get the obvious VRAM-related signs: low GPU usage, heavy FPS drops and spikes from hell, and in general much lower FPS than you would get normally.
We can see it here (video), and here (video):

Performance problems, and textures that look like a PS2 game's LOD textures, as in Hogwarts Legacy.

RE4 crashing.
 
What thread is this? 24 GB isn't even needed by any game at 12K. Tested it.

8 GB is enough for 98% of the games on the market at 4K. 12 GB is enough for 99%, and 16 GB covers everything, including 8K.
You had me going for a moment.... Well played! :roll:
 
That means nothing. We actually can't see how much VRAM a game really needs; only dev tools can, and no, MSI Afterburner can't either. We only know it's too much when you get the obvious VRAM-related signs: low GPU usage, heavy FPS drops and spikes from hell, and in general much lower FPS than you would get normally.
I have a 4090 with 24 GB, which is more than enough. With DLSS it's not even close to running out of VRAM; the 4090 will run out of steam before it runs out of VRAM. Without DLSS it's a slideshow in Cyberpunk: Phantom Liberty at 4K with max path tracing.
 
The issue is that screens rely on the viewer being able to see and discern all parts of the image, so sitting closer isn't an option unless you want the majority of the screen sitting in your peripheral vision.
Isn't that called the fully immersive experience?
 
In 2024 people are learning what they didn't earlier, which is that software/games will use whatever amount of RAM/VRAM they are allowed to use in order to do the job.
Don't you guys know about swapping? Developers do, and I don't know of any game that REQUIRES more than 8 GB of VRAM. You've got more? The game will use more, because your driver will tell the custom engine (or custom settings in the engine) that it can use more. Will you get more FPS? Maybe within a 5% margin, yay!
 
In 2024 people are learning what they didn't earlier, which is that software/games will use whatever amount of RAM/VRAM they are allowed to use in order to do the job.
Don't you guys know about swapping? Developers do, and I don't know of any game that REQUIRES more than 8 GB of VRAM. You've got more? The game will use more, because your driver will tell the custom engine (or custom settings in the engine) that it can use more. Will you get more FPS? Maybe within a 5% margin, yay!
I sat here for a good 20 minutes trying to think of something to type that would help you, but I'm lost. I usually check left and right before crossing the road; here you checked nothing, and you'll need to get hit by the truck before your lack of awareness gets checked.

If we apply the crossing-the-road analogy to what you are doing: you read nothing and watched none of the videos debunking your claims.
 
Yeah, the old "Allocation vs. Usage" conundrum.... :laugh:
Not only this. A game requiring some amount of VRAM is also quite different from the amount of VRAM it can or will use if more is available. Dynamic texture pools have been a thing for decades.
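To make the "dynamic texture pool" point concrete, here is a purely hypothetical sketch (not taken from any real engine): the pool sizes its budget from whatever VRAM is reported as usable and only evicts least-recently-used entries when that budget is exceeded, so the same game simply keeps more textures resident on a 24 GB card than it strictly needs on an 8 GB one.

```cpp
// Hypothetical streaming texture pool: the budget scales with available VRAM
// and eviction is LRU, happening only under budget pressure. Illustrative
// only -- not taken from any real engine.
#include <cstdint>
#include <list>
#include <string>
#include <unordered_map>

class TexturePool {
public:
    // Claim, say, 80% of whatever the driver reports as usable VRAM.
    explicit TexturePool(std::uint64_t usableVramBytes)
        : budget_(usableVramBytes * 8 / 10) {}

    // Request a texture: on a bigger card, more entries simply stay resident.
    void touch(const std::string& id, std::uint64_t sizeBytes) {
        auto it = lookup_.find(id);
        if (it != lookup_.end()) {                 // already resident: mark recently used
            lru_.splice(lru_.begin(), lru_, it->second);
            return;
        }
        lru_.push_front({ id, sizeBytes });        // "upload" / make resident
        lookup_[id] = lru_.begin();
        used_ += sizeBytes;
        while (used_ > budget_ && !lru_.empty()) { // evict only under pressure
            const Entry& victim = lru_.back();
            used_ -= victim.size;
            lookup_.erase(victim.id);
            lru_.pop_back();
        }
    }

    // This is what an overlay tool "sees" -- it tracks the budget, not the need.
    std::uint64_t residentBytes() const { return used_; }

private:
    struct Entry { std::string id; std::uint64_t size; };
    std::uint64_t budget_;
    std::uint64_t used_ = 0;
    std::list<Entry> lru_;
    std::unordered_map<std::string, std::list<Entry>::iterator> lookup_;
};
```

With this sketch, an 8 GB card would settle around ~6.4 GB resident while a 24 GB card would happily climb toward ~19 GB with identical assets, which is the "allocation vs. usage" pattern being discussed.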
 
If someone wants to release high-resolution texture patches for all games, specifically targeting mid- and high-end PCs, I wouldn't complain. It seems to me that the large proportion of users with relatively low VRAM limits the willingness of studios to ship better textures.
 