
Are game requirements and VRAM usage a joke today?

Enough VRAM doesn't mean the GPU itself will be strong enough to play at those settings, so once you turn a few settings down, both VRAM and GPU should be OK.
Right, it depends on the GPU class and game settings. You can get away with 12 GB for sure with DLSS and lower textures. When I said 16 GB, I was thinking of 1440p native, max settings or at least max textures, where you can be limited in some games.
 
VRAM size is also meaningless for us 144 Hz gamers, as the settings at which the GPU has enough headroom for such a framerate are VERY far away from VRAM size limitations in, like, 99.9% of titles. Textures alone don't tank performance that much, unless they average 8K (why?) or the compression sucks (then it's a reason to pound on the game dev's door rather than spend more quid on VRAM).
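To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python (my assumptions: 4 bytes/pixel for uncompressed RGBA8, 1 byte/pixel for BC7, and roughly +33% for the mip chain; real engines with streaming pools will budget differently):

# Rough per-texture VRAM estimate (illustrative only, not engine-accurate)
def texture_vram_mb(width, height, bytes_per_pixel, mips=True):
    base = width * height * bytes_per_pixel      # base mip level
    total = base * 4 / 3 if mips else base       # full mip chain adds ~1/3
    return total / (1024 ** 2)

for res, fmt, bpp in [(4096, "BC7", 1), (8192, "BC7", 1), (8192, "RGBA8 uncompressed", 4)]:
    print(f"{res}x{res} {fmt}: ~{texture_vram_mb(res, res, bpp):.0f} MB")
# ~21 MB, ~85 MB, ~341 MB -> only huge or uncompressed textures really add up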

Sure, I won't buy a GPU with less than 16 GB of VRAM, but that's only because I'm not gonna upgrade till I can afford a GPU 3+ times faster than mine. And by that time, 16 GB will be something like what 8 GB is today. If I were in the market for a GPU today, I'd basically pick a used 3080; it'll play 1080p144 just fine for the rest of the 2020s.
 
For 1440p? At any detail setting (including low)? I'd say 8 GB is a must, but one should be aiming for 12 when buying a new card in 2025. 16 GB is nice to have, but I wouldn't say it's absolutely necessary.
 

I guess we have to be very specific. I was thinking medium settings; I assume that's the best metric to aim for.
There's no need for ultra, and high can be a plus but isn't necessary, but low can be very disappointing, like watching the Avengers from Marvel in the advertising and getting the Abenters from Barvel.
 
That's highly dependent on the game, but yeah, we have to be specific. There's no such thing as "how much VRAM do I need in 2025". For what?
 
Recommended VRAM in 2025-2027 (RT + PT + DLSS/FSR FG + 4K HD texture DLCs):

1080p = 10-12 GB
1440p = 12-20 GB
2160p = 20-32 GB

IMHO.... !
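If you want to see where you actually land against budgets like these, one quick way (on an Nvidia card, assuming nvidia-smi is on your PATH) is to poll the driver while the game is running; a minimal Python sketch:

import subprocess

# Query current VRAM usage via nvidia-smi (Nvidia only; AMD users can check
# the Adrenalin overlay or tools like GPU-Z instead).
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=memory.used,memory.total", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(out)  # e.g. "9216 MiB, 16384 MiB"
# Note: this reports allocation, which can overstate what the game strictly needs.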
 
Too bad only the 5090 has more than 16 GB of VRAM, and the 7900 XT/XTX is really weak for PT+RT. In Indiana Jones with RT at 1440p, where you actually need the VRAM, they perform like a 12 GB Nvidia GPU.
 
As a de facto AMD fan, I can't entirely disagree with this.

At best, the 7900 XTX performs in RT like a 3090 or 4070 Super

[Charts: relative RT performance at 2560x1440 and 3840x2160]


The RX 9070 XT (OC), OTOH, performs another tier higher: 3090 Ti to 4070 Ti RT performance, about +15% to +18% over the XTX.
I'm probably switching to a 9070 XT 16 GB because it's modestly better in high-res RT than the 24 GB 7900 XTX.
 

While I've stuck with Nvidia because of RT, it seems AMD is finally catching up, which is a good thing. No competition means monopoly. Monopoly means worse prices for consumers and... companies get complacent. I hope AMD tears Nvidia a new one. I was excited for the 50 series to upgrade my 3080, but it seems I'm gonna pass, or simply grab a 4080/4090 and call it a day.
 
You want AMD to stick up against a monopoly, but you'll buy a 4090? Where's the logic?
 

CUDA is a standard. I went against my better judgement buying Nvidia for quite a number of reasons. If I could move all my workloads not only to Linux but also to OpenCL and the other stuff that AMD uses, I'd happily move every rig to AMD.
 
With 250-300 W, the 5070 looks like a 3090. But with 12 GB, it also looks like a 3070. Its GFLOPS look like a 3080, its raster looks like a 3090, and its cooler looks like a 3060 Ti's. Less PhysX than a GTX 580, but more expensive than a 4070 Ti. What a card. It's all over the place.
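On the GFLOPS point, the usual rule of thumb is FP32 throughput = 2 x shader count x boost clock. A quick sketch using the commonly listed reference specs (treat the shader counts and clocks below as approximate, board clocks vary):

# FP32 TFLOPS ~= 2 ops per clock (FMA) * shaders * boost clock (GHz) / 1000
def tflops(shaders, boost_ghz):
    return 2 * shaders * boost_ghz / 1000

print("RTX 5070:", round(tflops(6144, 2.51), 1), "TFLOPS")   # ~30.8
print("RTX 3080:", round(tflops(8704, 1.71), 1), "TFLOPS")   # ~29.8
print("RTX 3090:", round(tflops(10496, 1.70), 1), "TFLOPS")  # ~35.7
# so on paper the 5070 really does sit next to the 3080 in raw FP32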
 
Look here. 3070 8GB outperforms 6700XT 12GB with ease in new games at 4K/UHD in terms of minimum fps. Minimum fps would drop very low if VRAM was an issue -> https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/35.html
It's just a bad comparison, which actually tells us that 12 GB on a 192-bit bus doesn't suffice for 4K. The 6700 XT has a memory bandwidth of 384 GB/s while the 3070 has 448 GB/s: 192-bit vs 256-bit. The 6700 XT is not a 4K card, it will struggle. Is the 3070 a 4K card? Maybe, but not a very smooth experience. You compared apples with pears in the same basket, but they're from different gardens.
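For reference, those bandwidth numbers fall straight out of bus width times memory speed; a one-liner sketch (using the reference 16 Gbps GDDR6 on the 6700 XT and 14 Gbps GDDR6 on the 3070):

# Peak bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (Gbps)
def bandwidth_gbs(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps

print("RX 6700 XT (192-bit, 16 Gbps):", bandwidth_gbs(192, 16), "GB/s")  # 384.0
print("RTX 3070 (256-bit, 14 Gbps):", bandwidth_gbs(256, 14), "GB/s")    # 448.0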

I get your point; it's marketing and hysteria, and there are caps on VRAM usage in the drivers... lazy developers are at the core of this.

The difference between high and ultra textures is often just slightly less compression (sometimes ultra is uncompressed), which most of the time you won't notice when actually playing the game. Dropping texture quality to low, and sometimes medium, can be seen easily, but high and ultra mostly look identical, especially without 300% zoom and in motion.
Yeah, lazy coders, but those same lazy coders will make a lot of games eat too much CPU and too much VRAM, and those games won't cope on 128-bit cards either, even with 16 GB on them.

You have to be logical. Game developers know that the majority of PC gamers don't have more than 8 GB.
There's nothing logical about lazy code. Game developers know, and couldn't give a rat's ass, if you have to lower your settings in order to play that game on an 8 GB card.
From their point of view, you can play the game on an 8 GB card, just not with all the eye candy.

For example, War Thunder still hasn't dropped its ultra-low settings, so you can play the game on a 960. Because it's a "free" game? They want a huge player base from which a percentage will cash in.
Are they heroes? Nope. It's just an example of why a dev will support low-end GPUs.
Why would a lazy developer of a triple-A game do that? Why would they give you the chance to play on your 6 GB card and work hard for it? Popularity and fan loyalty are forgotten criteria that don't take up even 1% of a developer's brain activity.
Just because it's possible to run well-coded games at 2K and 4K with 8 GB of VRAM doesn't mean more than 2% of games will be like that.

The same story goes for the disk space that games require nowadays.
 