
Are game requirements and VRAM usage a joke today?

I'm sorry if you love the game, but geometry and textures are way more important for realism than lighting, and UE5 achieves both. If a game still uses geometry like this, RT won't help, and this is from the path-traced version of the game:
View attachment 317551
Control has the same issue with worse textures. With what UE5 achieves, my expectations are higher.

In PS1 (and even later) era design, such a pipe would be like 16 pixels on the wall's texture.
You don't make minor environment objects high-poly, especially in open-world games. That's optimisation 101.
 
I was thinking of upgrading my 2060 Super to a 3070, but the lack of VRAM is concerning. The other option is a 4070, which is too expensive atm. If only the 4060 Ti 16GB had x16 lanes, as I'm using an older motherboard. I'm tempted to get something from AMD with 16GB of VRAM, but I'm using some Nvidia features and I'm not sure if I can live without them at the moment.

For those with an RTX 3070, it would be great if there were some 2GB VRAM chips with the same timings as the current chips so you could swap them (or some BIOS editor to change the timings after swapping).
 
I was thinking of upgrading my 2060 Super to a 3070, but the lack of VRAM is concerning. The other option is a 4070, which is too expensive atm. If only the 4060 Ti 16GB had x16 lanes, as I'm using an older motherboard. I'm tempted to get something from AMD with 16GB of VRAM, but I'm using some Nvidia features and I'm not sure if I can live without them at the moment.

I know what you mean. I was unsure what to do when my 3070's 8GB limitation became very obvious trying to play TLOU. I ended up buying a 4090 because I am irresponsible and didn't want to lose DLDSR... but even as somebody who's always used Nvidia cards, I have to admit AMD is a better value proposition right now.


For those with an RTX 3070, it would be great if there were some 2GB VRAM chips with the same timings as the current chips so you could swap them (or some BIOS editor to change the timings after swapping).

I'm pretty sure I've read about people doing just that before, but I'm sure it's extremely difficult to do.
 
I was thinking of upgrading my 2060 Super to a 3070, but the lack of VRAM is concerning. The other option is a 4070, which is too expensive atm. If only the 4060 Ti 16GB had x16 lanes, as I'm using an older motherboard. I'm tempted to get something from AMD with 16GB of VRAM, but I'm using some Nvidia features and I'm not sure if I can live without them at the moment.

For those with an RTX 3070, it would be great if there were some 2GB VRAM chips with the same timings as the current chips so you could swap them (or some BIOS editor to change the timings after swapping).
I think part of the issue is that if these mid-range cards had low-end pricing, i.e. about $150-250, then one could excuse the low VRAM. But they don't; they're at high-end pricing, or even close to former flagship pricing in some cases, and at that price point, having that amount of VRAM is inexcusable.

I am in the same boat. I use SGSSAA on old games I like to keep playing again and again, and I don't know of a way to utilise that on AMD cards; if it were possible, I would have jumped over already.

Nvidia better software, AMD better hardware?
 
I know what you mean. I was unsure what to do when my 3070's 8GB limitation became very obvious trying to play TLOU. I ended up buying a 4090 because I am irresponsible and didn't want to lose DLDSR... but even as somebody who's always used Nvidia cards, I have to admit AMD is a better value proposition right now.




I'm pretty sure I've read about people doing just that before, but I'm sure it's extremely difficult to do.
I could afford a 4070, but it doesn't make sense to me to get one at the current prices here, as I only play games from time to time. When I do, though, I like to play them at good quality without having issues. For work purposes, my 2060 Super is more than enough, and I was just thinking of getting a newer card for the new features (AV1 etc.) and moving the 2060 Super to my other system that doesn't have one. The 3070 was my first option, as I can get one at a good price, but I'm not sure if it makes sense to get an 8GB card now.

I was thinking about the 4060 Ti 16GB too, but I'd have to change my motherboard to one with PCIe 4.0 (and it doesn't make sense to go from AM4 to AM4), and if I go to AM5, I basically have to build a new computer.

I don't think it's that hard to swap the chips. You'd still need to go to a specialist for it, but I think the only problem is that the VRAM timings would be wrong for the new chips, and without the possibility to edit the timings in the BIOS, you'd get all sorts of artifacts/bugs.
 
Looking at all those charts, the 3080 10GB seems to be holding up pretty well still.
 
I'm hearing that there will be an RTX 40 series refresh. Not sure if it's true or not, but it sounds interesting; maybe the 4060 Ti 12GB will use x16 lanes (I see it's using AD104) and will have a decent price, or the current 4070 will drop in price more.

View attachment 317708
 
I'm hearing that there will be an RTX 40 series refresh. Not sure if it's true or not, but it sounds interesting; maybe the 4060 Ti 12GB will use x16 lanes or the current 4070 will drop in price more.

View attachment 317708

Thought they might have upped the RAM on the 4090 Ti.
 
I don't really see it as an issue. After all, if the GPU uses all its VRAM, it just steals it from the system; it's no problem if your system RAM is not too slow. I just bought a 3080 with 10GB of VRAM; it's more than enough in my view to last me 3 or 4 years.
 
I was looking at old videos and found one where I was playing Guild Wars 2 back in 2013, barely using 300 MB of VRAM at 5040x900 resolution.


I think I was using either a Radeon HD 7850 or an R9 380 at the time, and I recall GW2 being one of the best-looking games (or at least MMORPGs) at that time. It must have been relatively optimized too for me to be pulling off 50 FPS :p

What's with some game requirements nowadays asking for 8GB or more of VRAM for 1080p? I've heard A Plague Tale and some Jedi game are big offenders.

"WHY DOES MORE DEMANDING GRAPHICS REQUIRE BETTER HARDAWARE?!?!?!?!111"
 
"WHY DOES MORE DEMANDING GRAPHICS REQUIRE BETTER HARDAWARE?!?!?!?!111"
We all know that isn't it, though.

The FF7 remake came out with PS3-quality textures and is a VRAM guzzler.

Likewise, you can mod old games with extremely nice textures and they don't suddenly require over 8 gigs of VRAM.
 
HUB did some good work in the past, but I can't watch them anymore. For an unbiased publication, they sure seem to be biased... with the hood pulled down and a scowl... bah.
 
We all know that isn't it, though.

The FF7 remake came out with PS3-quality textures and is a VRAM guzzler.

Likewise, you can mod old games with extremely nice textures and they don't suddenly require over 8 gigs of VRAM.

Amazing argumentation - one remake has poor textures, so no game has better graphics than 10 years ago... right...

And graphics is a lot more than textures, genius.
 
Amazing argumentation - one remake has poor textures, so no game has better graphics than 10 years ago... right...

And graphics is a lot more than textures, genius.
It's an example, not an isolated case.

Historically, textures have been the biggest guzzler of VRAM. I explained in an earlier reply in this thread why I think recent titles have seen a dramatic increase in VRAM requirements.
 
We all know that isn't it, though.

The FF7 remake came out with PS3-quality textures and is a VRAM guzzler.

Likewise, you can mod old games with extremely nice textures and they don't suddenly require over 8 gigs of VRAM.
Having PS3-quality textures on three times the PS3 object budget would naturally demand more memory.

Modding an old game with 4K textures would not give the same result as a 4K'd modern game because:
A- They don't employ the same or similar rendering pipelines.
B- As implied above, they don't use the same number of textures in the same frame.
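
To put rough numbers on point B (purely illustrative figures, not measurements from the games discussed here): the per-texture cost is modest, and it's the number of unique textures resident in a frame that adds up. Triple the object/material budget and the texture memory triples, even at identical per-texture quality.

```cpp
// Illustrative texture-memory arithmetic: a 4K BC7 texture is 1 byte/texel,
// plus roughly one third extra for the mip chain. All counts are made up.
#include <cstdio>

int main() {
    const double mib = 1024.0 * 1024.0;
    const double tex4k = 4096.0 * 4096.0 * 1.0 * (4.0 / 3.0) / mib; // ~21 MiB
    std::printf("one 4K BC7 texture  : %.0f MiB\n", tex4k);
    std::printf("100 unique materials: %.1f GiB\n", 100 * tex4k / 1024.0);
    std::printf("300 unique materials: %.1f GiB\n", 300 * tex4k / 1024.0);
    return 0;
}
```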
 
Having PS3-quality textures on three times the PS3 object budget would naturally demand more memory.

Modding an old game with 4K textures would not give the same result as a 4K'd modern game because:
A- They don't employ the same or similar rendering pipelines.
B- As implied above, they don't use the same number of textures in the same frame.
The amount of textures is a reasonable point.

I suppose I need to say it again, as you guys probably skimmed over my earlier post: there is an issue with developers putting things into VRAM that historically would go into RAM. I don't know if that's what you meant by the rendering pipeline differences.
 
there is an issue with developers putting things into VRAM that historically would go into RAM
Care to give examples?
 
Care to give examples?
The new Far Cry game moved things from VRAM into RAM to fix some issues. But mostly it's basic observation: consoles have a unified memory architecture, and then we see excessive VRAM usage combined with quite low RAM usage. Putting two and two together.

I might be wrong, of course, but I have seen an article saying the same thing, which adds a bit more to it.
 
And it's probably gonna get worse with the VRAM needed, especially when you see those half-baked, unoptimized releases you get for paying $50+/€50+...

@AusWolf
While it might work with some stuff, even my "limited" PC use (vs some others) has shown "you" might be forced to use the latest.
When your video editing software is 2-10x slower because it's too old to know to use the dGPU (instead of the CPU),
it's not really an option anymore to use a previous version.

And I say that while playing Win98/XP games on 10, so I'm not an "only use the latest" kind of person :D
 
The new Far Cry game moved things from VRAM into RAM to fix some issues. But mostly it's basic observation: consoles have a unified memory architecture, and then we see excessive VRAM usage combined with quite low RAM usage. Putting two and two together.
I'll have to look into the Far Cry case. But as far as the unified memory is concerned, I don't think it matters at all.
Passing things to VRAM requires explicit interaction with the driver, typically through one of the popular APIs.
To store, say, NPC AI parameters (a CPU/main-memory task) in VRAM would mean the developer chose to, and spent time writing code for it. And I doubt anyone would waste effort on something so counter-intuitive.
The prevalence of non-title-specific engines (which abstract much of the memory management) makes this even less of a possibility, imo.
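
To make the "explicit choice" point concrete, here is a minimal Vulkan sketch (device creation omitted; this is illustrative, not code from any game discussed here). A resource only lands in VRAM because the code deliberately picks a DEVICE_LOCAL memory type:

```cpp
// Hedged sketch: allocating from a DEVICE_LOCAL (i.e. VRAM) memory type.
// Nothing ends up in VRAM "by accident"; the application has to ask for it.
#include <vulkan/vulkan.h>
#include <stdexcept>

VkDeviceMemory allocate_vram(VkPhysicalDevice phys, VkDevice dev,
                             VkDeviceSize size, uint32_t allowedTypeBits) {
    VkPhysicalDeviceMemoryProperties props{};
    vkGetPhysicalDeviceMemoryProperties(phys, &props);

    for (uint32_t i = 0; i < props.memoryTypeCount; ++i) {
        const bool allowed = (allowedTypeBits & (1u << i)) != 0;
        const bool deviceLocal = (props.memoryTypes[i].propertyFlags &
                                  VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT) != 0;
        if (allowed && deviceLocal) {
            VkMemoryAllocateInfo info{VK_STRUCTURE_TYPE_MEMORY_ALLOCATE_INFO};
            info.allocationSize = size;
            info.memoryTypeIndex = i;
            VkDeviceMemory mem = VK_NULL_HANDLE;
            if (vkAllocateMemory(dev, &info, nullptr, &mem) == VK_SUCCESS)
                return mem; // caller binds a buffer/image to this and uploads
        }
    }
    throw std::runtime_error("no suitable DEVICE_LOCAL memory type");
}
```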
 
I'm hearing that there will be an RTX 40 series refresh. Not sure if it's true or not, but it sounds interesting; maybe the 4060 Ti 12GB will use x16 lanes (I see it's using AD104) and will have a decent price, or the current 4070 will drop in price more.

View attachment 317708
RAAAH. Super Ti RTX Ultimat0rzz
Sorry. Couldn't help myself.
 
If I had to guess, games can swap assets on the fly to minimize the perf hit when there's not enough VRAM available in rasterized titles, but with RT enabled, the GPU needs access to all the data at once to calculate the light bounces. And pulling from system memory slows things to an absolute craaawl.

Upscaling can help all the crucial data fit into a smaller framebuffer.
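
To put a rough number on that, here is a back-of-the-envelope sketch of just the render-target side, assuming a hypothetical five-target G-buffer plus a depth buffer (layouts vary per engine; these are not figures from any specific game). Rendering internally at 1440p instead of native 4K cuts those allocations by more than half:

```cpp
// Illustrative render-target arithmetic: 5 colour targets at 8 bytes/pixel
// on average, plus a 4-byte depth buffer. Real engine layouts differ.
#include <cstdio>

double targets_mib(int w, int h) {
    const double bytesPerPixel = 5 * 8 + 4;
    return w * static_cast<double>(h) * bytesPerPixel / (1024.0 * 1024.0);
}

int main() {
    std::printf("native 4K render     : %.0f MiB\n", targets_mib(3840, 2160)); // ~348
    std::printf("1440p, upscaled to 4K: %.0f MiB\n", targets_mib(2560, 1440)); // ~155
    return 0;
}
```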

Well, let's examine the premise that RT is the root cause of the 3070's FPS issues. Let's take a look at RT in Shadow of the Tomb Raider. Here is the memory usage profile of the game (source: Guru3D; note that these figures are max settings without ray tracing enabled, as I was not able to find a chart with RT. TPU didn't start doing its VRAM charts with RT until more recently. The figure below is maximum VRAM usage across 3 runs):

View attachment 317770

Now, a 2060 6GB should immediately cause issues if your theory is correct, as given the chart, we know the game utilizes more than the card has even without ray tracing enabled. Looking at performance, though, we can see it's fine:

View attachment 317772


The cards here, including the 2060, are scaling according to their power, and as you can see, the dip is consistent across all cards when comparing RT Ultra vs RT off.

Even if we assume that RT needs access to all its associated data in VRAM, it doesn't explain why the game wouldn't just put that high-priority data in VRAM and move low-priority data to main system memory, particularly when you are talking about a mere 35 MB of data in the case of Far Cry 6. Again, to me it seems like a bug or an extreme oversight by Ubisoft, because there's no logical reason for it. You would not put your highest-priority data in main system memory when you can just boot lower-priority textures down instead, which you can stream in without much of a performance impact. I would normally present more examples, but it's surprisingly difficult to find VRAM usage for games with RT enabled that also includes cards with too little VRAM, like the 2060, in the RT results. Even TPU's own charts tend to exclude the lower-VRAM cards when it comes to RT.
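
For what it's worth, the "demote low-priority data" mechanism isn't hypothetical: D3D12 exposes residency control directly. A minimal sketch, assuming the heaps were created elsewhere (rtHeap and texHeap are made-up names for illustration):

```cpp
// Hedged sketch: keep the RT-related heap resident at high priority and let a
// low-priority texture heap spill to system memory. MakeResident() would bring
// it back when needed. Error handling omitted for brevity.
#include <d3d12.h>

void demote_textures(ID3D12Device1* dev, ID3D12Heap* rtHeap, ID3D12Heap* texHeap) {
    ID3D12Pageable* hi[] = { rtHeap };
    ID3D12Pageable* lo[] = { texHeap };
    const D3D12_RESIDENCY_PRIORITY prio[] = { D3D12_RESIDENCY_PRIORITY_HIGH };
    dev->SetResidencyPriority(1, hi, prio); // hint: this heap matters most
    dev->Evict(1, lo);                      // demote low-priority textures
}
```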

Those reviews all have RT disabled though.

It was to demonstrate the fact that you can go over your VRAM limit without dropping to 1/5th of your FPS. If we are testing the hypothesis that VRAM is the cause, the same game should exhibit the same behavior on other GPUs, particularly ones that are more deficient.
 
I have 20GB, so no, they are not a joke, and it doesn't affect me.
 
Is utilising system RAM as an overflow area an automatic thing in graphics APIs, or do devs have to implement it (this was suggested to me in an earlier reply)? We have seen some games, when out of or low on VRAM, show visual artefacts instead of a performance drop-off: streaming LOD staying at low LOD, or textures not loading at all. That might explain why performance doesn't necessarily drop off, but instead you just don't get the proper visuals.

Also, DF noted that when games stutter, it isn't necessarily picked up by benchmark measuring tools. This is something they observed in the FF7 remake, when their competitors were reporting smooth performance on a stutter-fest of a game and they knew it was wrong just from eyeballing it.
 
Well, let's examine the premise that RT is the root cause of the 3070's FPS issues. Let's take a look at RT in Shadow of the Tomb Raider. Here is the memory usage profile of the game (source: Guru3D; note that these figures are max settings without ray tracing enabled, as I was not able to find a chart with RT. TPU didn't start doing its VRAM charts with RT until more recently. The figure below is maximum VRAM usage across 3 runs):

View attachment 317770
Now, a 2060 6GB should immediately cause issues if your theory is correct, as given the chart, we know the game utilizes more than the card has even without ray tracing enabled. Looking at performance, though, we can see it's fine:

View attachment 317772

The cards here, including the 2060, are scaling according to their power, and as you can see, the dip is consistent across all cards when comparing RT Ultra vs RT off.

That first chart is showing VRAM allocation. Games almost always allocate more than the bare minimum required to run smoothly. In the Far Cry 6 example, it's allocating over 10GB on the 6700 XT, but is still able to function within the 8GB memory pool of the 3070 once FSR is enabled. We have no way of knowing exactly how much crucial data is spilling over.
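
The allocation-vs-need distinction is also visible in what the OS itself reports. A minimal DXGI sketch (Windows-only, error handling mostly omitted): CurrentUsage is what the process has allocated and Budget is what the OS will let it keep resident, but neither counter says how much of that allocation a frame actually touches.

```cpp
// Sketch: reading the VRAM budget/allocation counters that most overlays show.
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory4* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    IDXGIAdapter1* adapter = nullptr;
    factory->EnumAdapters1(0, &adapter); // first (usually primary) adapter

    IDXGIAdapter3* adapter3 = nullptr;
    adapter->QueryInterface(IID_PPV_ARGS(&adapter3));

    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);
    std::printf("budget: %llu MiB, allocated: %llu MiB\n",
                info.Budget >> 20, info.CurrentUsage >> 20);
    return 0;
}
```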

Even if we assume that RT needs access to all its associated data in VRAM, it doesn't explain why the game wouldn't just put that high-priority data in VRAM and move low-priority data to main system memory, particularly when you are talking about a mere 35 MB of data in the case of Far Cry 6. Again, to me it seems like a bug or an extreme oversight by Ubisoft, because there's no logical reason for it. You would not put your highest-priority data in main system memory when you can just boot lower-priority textures down instead, which you can stream in without much of a performance impact. I would normally present more examples, but it's surprisingly difficult to find VRAM usage for games with RT enabled that also includes cards with too little VRAM, like the 2060, in the RT results. Even TPU's own charts tend to exclude the lower-VRAM cards when it comes to RT.



It was to demonstrate the fact that you can go over your VRAM limit without dropping to 1/5th of your FPS. If we are testing the hypothesis that VRAM is the cause, the same game should exhibit the same behavior on other GPUs, particularly ones that are more deficient.

And yet, the 4060 Ti 16GB dusts its 8GB brother (and all other 8GB cards, for that matter) in FC6 with RT. It's really one of the clearest examples there is.

[chart: Far Cry 6 RT performance at 3840x2160]
 