
Games that require more than 24 GB of VRAM...

If someone wants to release high-resolution texture patches for all games, specifically targeting mid- and high-end PCs, I wouldn't complain. It seems to me that the large proportion of users with relatively low VRAM limits the willingness of studios to apply better textures.
Storage limitations are also a thing. Textures take space, and quite a lot of it. With game installs being the size they are these days, keeping installs at a manageable size is definitely a strong factor for devs in deciding which textures to ship.
 
As it stands, these are the games that NEED more than 8 GB of VRAM if you want the best visual experience:

Resident Evil 4 Remake (it will simply crash out).

Games that will give you a shaky experience due to VRAM:

Far Cry 6 + HD textures (frame rates down to 7 FPS).
Hogwarts Legacy (blurred textures and higher-resolution assets not loading in, plus 1% stutter).
Ratchet & Clank (around a 20 FPS loss).
 
Storage limitations are also a thing. Textures take space, and quite a lot of it. With game installs being the size they are these days, keeping installs at a manageable size is definitely a strong factor for devs in deciding which textures to ship.
I'd say it would be perfectly conceivable to release the high-resolution textures in a patch to be downloaded separately, and it would also reduce the size of the base game.

Devs don't always use the best compression techniques either, probably also a legacy of the low-core-count era that Intel created.
 
I'd say it would be perfectly conceivable to release the high-resolution textures in a patch to be downloaded separately, and it would also reduce the size of the base game.

Devs don't always use the best compression techniques either, probably also a legacy of the low-core-count era that Intel created.

Yup, until we abandon the quad-core "ice age" and Nvidia's low-VRAM performance-segment GPUs, games will suffer a bit. Unfortunately, gamers still moan and complain that AVX support is required in newer games, indicating many are still stuck in the Core 2/Phenom era... and some even expressed outrage at the fact that Horizon Forbidden West requires a 12-year-old Ivy Bridge processor (128-bit AVX + F16C instructions) to run. Muh 2600K! How dare you!

Gamers can be such sore losers.
 
I am pretty sure Sandy Bridge and AMD FX were the earliest chips to support the AVX extensions, and by that metric they would still pass as playable CPUs now, i.e. 30 FPS...

I doubt H:FW will run well on Ivy Bridge unless it is a Xeon.

Ahhh...

 
I am pretty sure Sandy Bridge and AMD FX were the earliest chips to support the AVX extensions, and by that metric they would still pass as playable CPUs now, i.e. 30 FPS...

I doubt H:FW will run well on Ivy Bridge unless it is a Xeon.

Ahhh...


Sandy Bridge (i7-2600K) does support AVX, but it does not support F16C - that was added with Ivy Bridge (i7-3770K). This extends to the i7-3000X CPUs (Sandy Bridge-E), F16C being present only on i7-4000X CPUs (Ivy Bridge-E). AMD "Heavy Equipment" chips, including Bulldozer and Piledriver, should support it, but I do not know for certain.
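For anyone who wants to check their own chip, here is a minimal sketch that reads /proc/cpuinfo and reports whether AVX and F16C are present. It is Linux-only and purely illustrative; other platforms would need something like the py-cpuinfo package.

```python
# Minimal check for AVX and F16C support by parsing /proc/cpuinfo (Linux only).
def cpu_flags():
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
print("AVX :", "avx" in flags)   # Sandy Bridge / Bulldozer and newer
print("F16C:", "f16c" in flags)  # Ivy Bridge and newer on Intel
```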
 
I don't have to; most generation runs fine on average cards.
But I do have a few Tesla and Radeon Pro cards.
The more complex the task, the more VRAM is needed.

Without a GPU you can still run a GPT-style model, just not a very complex one:
GPT4All
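As a rough sketch of what CPU-only inference looks like with the GPT4All Python bindings (the model file name here is just an example; any model from the GPT4All catalogue works):

```python
from gpt4all import GPT4All

# Example model name; substitute whichever GGUF model you have downloaded.
model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")  # runs on the CPU by default

with model.chat_session():
    print(model.generate("Summarise why VRAM matters for texture quality.", max_tokens=150))
```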

For generating pictures, Stable Diffusion offline needs a GPU (without one it is not efficient: it maxes out my overclocked 18-core i9 and eats up 400 W).
Portable version.
It is fast enough to generate 512x512 pictures, but with more VRAM it can generate larger, more complex pictures.
The most important factors are VRAM size and speed; average Nvidia cards are faster (most tools are optimized for CUDA).
I got 4-6 images/minute with a cheap mining card (P104-100, roughly a GTX 1070). My second card, an Arc A770 16 GB, is much faster, but it can't do CUDA, so a lot of tweaking is needed to use Intel Arc's AI acceleration.
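For reference, this is roughly what the offline Stable Diffusion path looks like with the Hugging Face diffusers library; the checkpoint name is just an example, and attention slicing is one of the usual tricks for fitting into less VRAM:

```python
import torch
from diffusers import StableDiffusionPipeline

# Example checkpoint; any SD 1.5-class model works. fp16 roughly halves VRAM use.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")           # "cpu" also works, but is far slower
pipe.enable_attention_slicing()  # trades a little speed for lower peak VRAM

image = pipe("a ray-traced castle at sunset", height=512, width=512).images[0]
image.save("castle.png")
```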

I'm currently working on making a Skyrim ChatGPT all-NPC mod run locally (right now I use the openai.com API, but that is not free). That needs a second CUDA-capable GPU to run offline.
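Since most local servers (llama.cpp's server, LM Studio, etc.) expose an OpenAI-compatible endpoint, switching from openai.com to a local model can be as small as repointing the client. The URL and model name below are placeholders, not anything from the mod itself:

```python
from openai import OpenAI

# Point the standard OpenAI client at a local OpenAI-compatible server instead
# of api.openai.com; local servers usually ignore the API key.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

reply = client.chat.completions.create(
    model="local-model",  # placeholder; use whatever the local server exposes
    messages=[
        {"role": "system", "content": "You are a Skyrim guard. Stay in character."},
        {"role": "user", "content": "Any news from the other holds?"},
    ],
)
print(reply.choices[0].message.content)
```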


Oh my god....why?

You're going to have Bethesda decide that instead of writing their games, they can streamline their entire development into AI-enhanced garbage. Stop... Please... no more.




Joking aside, can you not see that happening? Between Fallout 4 being what it was, an arrow to the knee being Skyrim in a nutshell, and their stated goal of making "everything evolve organically so that you cannot tell the difference between a crafted mission and a generated one", isn't this screaming to actually happen? I... am sad that your work (if ever released) would be entirely within Bethesda's wheelhouse to steal and repurpose so they can crank out a Fallout or Elder Scrolls game every other year.
 
On max settings, BeamNG can take over 24 GB of VRAM :) don't ask how I know.
 
On max settings, BeamNG can take over 24 GB of VRAM :) don't ask how I know.
Great, now Nvidia has a reason to create a 48 GB consumer-grade card for 5 grand.
 
I created this thread specifically to find out which games require more than 24 GB of VRAM when running max settings. Input from owners of an RTX 3090, RTX 4090, or RX 7900 XTX would be helpful. Why? We want to see the RTX 5090 or the next top AMD Radeon come with at least 32 GB of VRAM, so we can get the maximum available resolution with the best image quality and stable performance.

The first game that requires more than 24 GB of VRAM at 8K resolution, max settings + RT, is Ratchet & Clank.

Several games require more than 24 GB of VRAM at 8K, even with DLSS Ultra Performance. Another one is The Witcher 3 remaster.

Tons are right on the edge, with 20+ GB of VRAM usage at 8K.

And for the "smart" people saying hUrR dUrR, ItS jUsT vRaM AlLoCaTiOn !!!!11 - no, it isn't. Performance tanks as soon as VRAM consumption reaches 24 GB.

[Attached screenshots: VRAM usage at 8K]

You can't reliably see a difference between 4K and 8K.
And at the point where you can, you would need a room so big that only those who are really well off and living comfy lives could experience it, since most people don't have rooms large enough to house a panel of the right size to make a visual difference.

The issue is that screens rely on the viewer being able to see and discern all parts of the image, so sitting closer isn't an option unless you want the majority of the screen outside your central field of view.
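To put a rough number on the viewing-distance argument, here is a quick sketch of pixels-per-degree for a 65" 16:9 panel at 8 feet, assuming the commonly cited ~60 pixels per degree for 20/20 vision. Both figures are illustrative assumptions, not numbers taken from the posts above.

```python
import math

def pixels_per_degree(diag_in, horizontal_pixels, distance_in):
    """Angular pixel density for a 16:9 panel viewed head-on."""
    width_in = diag_in * 16 / math.hypot(16, 9)
    pixel_in = width_in / horizontal_pixels
    return 1 / math.degrees(2 * math.atan(pixel_in / (2 * distance_in)))

for name, px in [("4K", 3840), ("8K", 7680)]:
    ppd = pixels_per_degree(65, px, 96)  # 96 in = 8 ft viewing distance
    print(f"{name}: ~{ppd:.0f} px/deg (20/20 vision resolves roughly 60 px/deg)")
```

At that distance both resolutions already exceed what 20/20 vision resolves; sit much closer and the numbers drop, which is the crux of the disagreement that follows.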

Spoken as someone who clearly hasn't seen 8K...

[Attached screenshots: 4K vs 8K comparison]
 
Several games require more than 24 GB of VRAM at 8K, even with DLSS Ultra Performance. Another one is The Witcher 3 remaster.

Tons are right on the edge, with 20+ GB of VRAM usage at 8K.

And for the "smart" people saying hUrR dUrR, ItS jUsT vRaM AlLoCaTiOn !!!!11 - no, it isn't. Performance tanks as soon as VRAM consumption reaches 24 GB.

Spoken as someone who clearly hasn't seen 8K...


Neither have you; you downsample to 4K.

I find it hilarious that you think blown-up screenshots viewed from about 2 feet away equate to a 100-inch screen where we'd have to press our faces up to the panel to see a difference.

 
Neither have you; you downsample to 4K.

I find it hilarious that you think blown-up screenshots viewed from about 2 feet away equate to a 100-inch screen where we'd have to press our faces up to the panel to see a difference.


I use a Samsung QN900C 65" TV - tell me more about what I have or don't have...

The difference in image clarity and detail is glaringly obvious while playing.

And the fact that you use Linus videos as proof of anything just says everything I need to know about you.

Edit: you've been a member for 2 days and have posted 56 times?! Screams fishy...
 
I use a Samsung QN900C 65" TV - tell me more about what I have or don't have...

The difference in image clarity and detail is glaringly obvious while playing.

And the fact that you use Linus videos as proof of anything just says everything I need to know about you.
Here is my 8K video...


Which, by the way, means you are not at 8K either; you are using reconstruction, just like I am with this video.

Don't make it personal; you can't tell a difference at normal viewing distances.
 
Here is my 8K video...


Which, by the way, means you are not at 8K either; you are using reconstruction, just like I am with this video.

Don't make it personal; you can't tell a difference at normal viewing distances.

Are you trying to be funny here? A 1440p YouTube video...

I use DLSS, yes, which is vastly superior to AMD's solution. Even DLSS Ultra Performance is only a very slight reduction in image quality at 8K.

Again, you're trying to tell me what I can or can't do xD You very obviously haven't tried 8K, so I won't waste any more time on you, because what you're saying is based only on your own assumptions and on what you've read from other people (who haven't tried it either).
 
So not 8K, then. Go back to the drawing board and stop pushing your ego and your need to flaunt your purchase.
You also have not seen nor tried 8K.

That was easy; thank you for your cooperation.
 
So not 8K, then. Go back to the drawing board and stop pushing your ego and your need to flaunt your purchase.
You also have not seen nor tried 8K.

That was easy; thank you for your cooperation.

I think we got the reason for your many posts during these last 2 days... obvious troll.

On the ignore list you go.
 
Several games require more than 24 GB of VRAM at 8K, even with DLSS Ultra Performance. Another one is The Witcher 3 remaster.

Tons are right on the edge, with 20+ GB of VRAM usage at 8K.

And for the "smart" people saying hUrR dUrR, ItS jUsT vRaM AlLoCaTiOn !!!!11 - no, it isn't. Performance tanks as soon as VRAM consumption reaches 24 GB.

[Attached screenshots: VRAM usage at 8K]

Spoken as someone who clearly hasn't seen 8K...

[Attached screenshots: 4K vs 8K comparison]
Exactly, high resolution is the best anti-aliasing method available. It's beautiful.
 
Exactly, high resolution is the best anti-aliasing method available. It's beautiful.

That, but even more so the extra image clarity and detail :D
 
8K will look better; more resolution usually does. Is it worth it... well, yes, if you have an 8K display large enough to really enjoy it. But for the rest of us with desktop displays in the 24" to 30-ish inch range, having to use downsampling to get a glimmer of it... not as much. It doesn't help that 8K hasn't exactly taken off like a rocket :(

Nice screenshots btw @Dragam1337 :). Now we just need more 8K displays :D
 
Not only that. The amount of VRAM a game requires is also quite different from the amount it can/will use if more is available. Dynamic texture pools have been a thing for decades.
Yup, and that's what causes the conundrum: you can't tell exactly how much VRAM a game needs, because it will use different amounts in different situations and on different hardware.
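For what it's worth, on Nvidia cards you can watch device-level VRAM use while playing via the nvidia-ml-py (pynvml) bindings, as in the sketch below. Note that this only shows what is allocated at that moment, not what the game strictly needs.

```python
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo,
)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)        # first GPU in the system
mem = nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM in use: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
nvmlShutdown()
```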
 
Game devs focus on optimizing and making their games run well on 12 and 16 GB max at the moment, and there will be no need for more than 20 GB in the near future either. The majority are on 12 and 16 GB, so it will stay this way for at least a few more years. The primary focus will be to optimize for the specs the majority of gamers currently use. If you make your game right now with a 20 GB (or higher) GPU requirement, you rule out 90%+ of your customers. So expect nothing to really break the 24 GB barrier any sooner than 4-5 years from now.
 
Game devs focus on optimizing and making their games run well on 12 and 16 GB max at the moment, and there will be no need for more than 20 GB in the near future either. The majority are on 12 and 16 GB, so it will stay this way for at least a few more years. The primary focus will be to optimize for the specs the majority of gamers currently use. If you make your game right now with a 20 GB (or higher) GPU requirement, you rule out 90%+ of your customers. So expect nothing to really break the 24 GB barrier any sooner than 4-5 years from now.

Lol, classic case of posting without reading anything in the thread, not even the OP's post...
 
Game devs focus on optimizing and making their games run well on 12 and 16 GB max at the moment, and there will be no need for more than 20 GB in the near future either. The majority are on 12 and 16 GB, so it will stay this way for at least a few more years. The primary focus will be to optimize for the specs the majority of gamers currently use. If you make your game right now with a 20 GB (or higher) GPU requirement, you rule out 90%+ of your customers. So expect nothing to really break the 24 GB barrier any sooner than 4-5 years from now.
Not necessarily. With proper optimization, GPUs with lower VRAM capacities could handle medium settings, reserving higher graphics configurations for models with 16 GB or more.

Btw, it's unlikely that developers under Nvidia's influence would implement such optimizations, as most studios receive active financial $$$ support from Nvidia. Imagine if the 4070 couldn't run games on high settings; it would be chaotic.
 
Not necessarily. With proper optimization, GPUs with lower VRAM capacities could handle medium settings, reserving higher graphics configurations for models with 16 GB or more.

Btw, it's unlikely that developers under Nvidia's influence would implement such optimizations, as most studios receive active financial $$$ support from Nvidia. Imagine if the 4070 couldn't run games on high settings; it would be chaotic.
Possible for sure, and it may happen, but very rarely. Something like 95% of all games will stay near the 16 GB mark for some time. There will be some games that push boundaries, like you say, with an Ultra or Extreme preset that goes over 16 GB, maybe around the 20 GB mark, but those will still be rare exceptions. If the average cost of the cheapest 16 GB Nvidia card, like the 4070 Ti Super, is around $800-850, then it makes no sense for game devs to push even harder. They still want to reach the majority of players, and for sure some enthusiasts with extreme presets, but even those enthusiasts will struggle to get 20 GB out of Nvidia, lol.
 
Linus did tests between 8K gaming and 4K. There was practically no difference at all in image quality.

Why bother? Wait until 16K or higher. There is no reason at all to change standards for such a minimal increase in resolution fidelity.

I personally am looking beyond 8K. It's a pointless upgrade right now.
 