
For those who think 12GB VRAM can max out everything

By that point both cards will be in unplayable territory and I am not sure about you, but I personally have no interest in discussing the merits of two different piles of shite. Oh look, this one is slightly less lumpy than that one, very cool.
The NV pricing makes their cards look crappy. Bought the 4070 Super some months back NIB at $681, tax included. The RX 6800 is far better bang for the buck in my opinion.
 
The NV pricing makes their cards look crappy. Bought the 4070 Super some months back NIB at $681, tax included. The RX 6800 is far better bang for the buck in my opinion.
If NVIDIA cards were cheaper, then the 4070s would be a favorable buy, but that doesn't really say anything about VRAM or whatever OP is spewing. I do miss my RX 6800 and 6800 XT though...
 
like a flat earth stooge.
These lads even have more credibility, because they have at least witnessed the flatness of the visible part of this globe's surface. Short-sighted, sure, yet they still have something to say.

OP, however, has never seen any of the GPUs they're discussing; they would've posted different messages otherwise. Unless this thread is trolling (can't ignore that possibility!).

The RX 6800 is far better bang for the buck in my opinion.
Especially if we go aftermarket, where it costs $280-ish and murders everything in FPS/$.
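
To make the "bang for the buck" point concrete, here's a minimal FPS-per-dollar sketch. The prices and framerates below are assumed placeholders, not benchmark data; plug in real street prices and your own results:

```python
# Toy FPS-per-dollar comparison. All prices and framerates are
# hypothetical placeholders, not measurements.
cards = {
    "RX 6800 (aftermarket)": {"price_usd": 280, "avg_fps": 100},  # assumed
    "RTX 4070 SUPER (new)":  {"price_usd": 681, "avg_fps": 120},  # assumed
}

for name, c in cards.items():
    print(f"{name}: {c['avg_fps'] / c['price_usd']:.3f} FPS per dollar")
```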
 
The GTX980, GTX1660 and RTX2060 would like to have a word. All three of those cards are still a fair shout for gaming. Remember, just because it's older, doesn't mean it's unusable.
GTX680 4GB. GTX780 6GB. The list can go on and on.

Actually got some recent games running on the 680 2GB. Granted, it was a bit of a texture mess, but it still ran. Mostly.
 
If you look at the recommended specs for modern games on Steam, the requirements are not that steep for 99% of games.

But when you want hundreds of FPS at high refresh and res, that is where you start to pay.

For instance, I play on a TV, so 4K60 is totally OK with me; my TV is 60Hz anyway.

My kid plays at 1080p/60 on an 8GB card and he loves it. Looks pretty good to me too.
 
Downgraded from 12GB to 10GB and I still play at 4K.
 
If you look at the recommended specs for modern games on Steam, the requirements are not that steep for 99% of games.

But when you want hundreds of FPS at high refresh and res, that is where you start to pay.

For instance, I play on a TV, so 4K60 is totally OK with me; my TV is 60Hz anyway.

My kid plays at 1080p/60 on an 8GB card and he loves it. Looks pretty good to me too.
And the Steam Deck has 4 CPU cores, an 8 CU iGPU, and no dedicated VRAM, and I love mine to bits! :)

Okay, the Steam Deck was a radical example, but what I tend to say is that GPUs didn't get that much more expensive; our expectations grew bigger, too.
 
And the Steam Deck has 4 CPU cores, an 8 CU iGPU, and no dedicated VRAM, and I love mine to bits! :)

Okay, the Steam Deck was a radical example, but what I tend to say is that GPUs didn't get that much more expensive; our expectations grew bigger, too.
I get it man :)

I see people smack-talk 8 and 12GB cards on the daily, it seems... meanwhile, over here, we are enjoying the heck out of them.

Can you do better? Sure...
 
News flash!!!!

The thing that isn't the best, isn't the best.

If your needs are met by the amount of frame buffer you have, then roll with it. If they aren't, then you better buy a card with more.
 
Or turn on DLSS and Frame Gen haha :laugh:
 
GTX680 4GB. GTX780 6GB. The list can go on and on.
Those cards really are too old to be relevant for modern gaming.
Actually got some recent games running on the 680 2GB. Granted, it was a bit of a texture mess, but it still ran. Mostly.
Fair enough. Still, those really are too old for games made recently.

If you look at the recommended specs for modern games on Steam, the requirements are not that steep for 99% of games.
Fair point and the same is true for Epic and GOG.
 
Downgraded from 12GB to 10GB and I still play at 4K.
I was playing at 4K with my 3070Ti for like 2 years lol. It was not terrible, but it wasn't awesome :D

But the 12GB and the extra horsepower are welcome in comparison :laugh:
 
Any talk about VRAM is often framed in terms of "future-proofing". 12GB might be fine now, but it's too close to the limit to give peace of mind, assuming VRAM requirements keep growing over the next few years and you'll run into issues with textures/assets loading. From my personal experience, there's only a handful of games where 8GB causes problems, and even fewer games where fixing those issues would result in a noticeable downgrade in texture quality. "My 2020 GPU can only handle early-2020-level textures in 2025, yuck."

Then there are games like The Last of Us, which was really unpolished at launch. They somehow managed to make better-looking textures that used less VRAM in later patches :eek:
[attached screenshot: texture comparison]
But both AMD and Nvidia are looking at neural compression/decompression of textures to lower the impact on VRAM, and frame generation is also getting optimized to lower its memory footprint.
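
To put rough numbers on why compression matters so much for VRAM, here's a back-of-the-envelope sketch of what a single 4096x4096 texture costs, uncompressed vs. block-compressed (BC7 stores 1 byte per texel; the 4/3 factor accounts for the full mip chain). Neural compression aims to shrink this further:

```python
# Rough VRAM cost of one 4096x4096 texture, including its mip chain
# (the chain of smaller mips adds about 1/3 on top of the base level).
def texture_mib(width, height, bytes_per_texel, with_mips=True):
    base = width * height * bytes_per_texel
    return (base * 4 / 3 if with_mips else base) / 2**20

print(f"RGBA8, uncompressed: {texture_mib(4096, 4096, 4):.1f} MiB")  # ~85 MiB
print(f"BC7, 1 byte/texel:   {texture_mib(4096, 4096, 1):.1f} MiB")  # ~21 MiB
```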
Which one has "depth of field" enabled, or set to ultra/high? That matters, as many games have it heavily overdone now.
 
Those cards really are too old to be relevant for modern gaming.

Fair enough. Still, those really are too old for games made recently.


Fair point and the same is true for Epic and GOG.
I made the 680 play Once Human successfully. I dunno about you, but if it works, it works.
 
I guess people still ignore the fact that a lot of games today can dynamically and progressively swap textures for lower-quality ones depending on the length of the gaming session. Looking at framerate or frametime doesn't always reveal the truth.

Hey, my frame buffer is at 7.5/8GB full.
I'm good…
In the meantime:

[attached screenshot]
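
Here's a minimal sketch of the kind of streaming behaviour described above, assuming a made-up engine (none of these names are a real API): when the resident set exceeds the budget, the streamer silently drops mip levels, so the memory counter looks fine while texture quality quietly degrades:

```python
# Toy texture streamer: dropping one mip level quarters a texture's
# footprint, so frametime stays smooth while quality degrades.
BUDGET_MB = 8 * 1024 * 0.94  # stay a little under an 8 GB card

# 120 resident textures at ~85 MB each (hypothetical workload).
textures = [{"id": i, "mb": 85.0, "mip_bias": 0} for i in range(120)]

def used_mb():
    return sum(t["mb"] for t in textures)

while used_mb() > BUDGET_MB:
    victim = min(textures, key=lambda t: t["mip_bias"])  # degrade evenly
    victim["mip_bias"] += 1
    victim["mb"] /= 4  # one mip level down = 1/4 of the memory

print(f"resident: {used_mb():.0f} MB of {BUDGET_MB:.0f} MB budget")
print("textures degraded:", sum(1 for t in textures if t["mip_bias"] > 0))
```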
 
FYI, a 4090 has the performance to play some games at 8K. Sadly, there are many occasions where it just runs out of VRAM and the game either hard-crashes or stutters like crazy.
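
The arithmetic backs that up: just the render targets at 8K are huge before a single texture is counted. A rough sketch (the G-buffer layout here is an assumption for illustration, not any specific engine's):

```python
# Back-of-the-envelope VRAM for render targets at 8K (7680x4320).
W, H = 7680, 4320
pixels = W * H  # ~33.2 million pixels, 4x the pixel count of 4K

# Hypothetical deferred renderer: 4 G-buffer targets + HDR colour +
# depth, at roughly 4 bytes per pixel each. Real engines vary widely.
targets, bytes_per_pixel = 6, 4
print(f"{pixels * targets * bytes_per_pixel / 2**20:.0f} MiB "
      f"just for render targets")  # ~760 MiB
```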
 
If your needs are met by the amount of frame buffer you have, then roll with it. If they aren't, then you better buy a card with more.
Lest we forget that, out of the blue, a bottom-of-the-barrel card sometimes shows up with a truckload of framebuffer with very narrow practical use, so expect the following at some point in the future: NVIDIA GeForce RTX 5050 24GB (GDDR6, 384-bit) (GB207).
 
I'm genuinely bewildered at the industry gaslighting surrounding this stuff.

"You need to upgrade your $500 GPU to a $1,000 GPU or your wall signs will look like this in our premium $60 modern AAA games. This is totally normal, games have always looked this way on anything less than 6GB VRAM..."

...vs...

This and this are what wall signs / displays can look like on a 27-year-old game engine running at 300fps on $70 2GB VRAM GPUs, with a few hours of work by one modder in his spare time who doesn't even work in the game industry...

^ This "You need a $2,000 GPU or you deserve to have abnormally ugly games, because we refuse to do absolutely any optimisation at all anymore, and that's all your fault, gamers" gaslighting is insane. Looking like this shouldn't even be a thing, not even on low on 4GB VRAM GPUs, when wall posters in 1GB VRAM games like Bioshock (2007) didn't look anywhere near that bad 18 years ago (that's the original DVD-ROM version, not the remaster).

It's not the hardware that's "broken" here, it's the new game engines that are so overweight and overly convoluted that 90% of the game developers who license them have absolutely no clue how to use them anymore. So bad port after bad port is churned out, with a concerted, dishonest effort to normalize that almost psychotic level of staggering incompetence and quite literally brainwash and victim-blame "Real Gamers (tm)" into thinking it's somehow "normal" and their fault for not buying 4-digit-priced GPUs. Bullshit.

2000's titles turned down to the Med preset very definitely didn't run 10x slower and look 10x worse at the same time vs 90's titles, the way some junk we're seeing today does vs 2010's games. For those who've forgotten, many UE3 titles actually ran very well on High on low-end hardware of the day without ending up an ugly smear-fest. "We can't get our games to either look right or run properly anymore" is definitely a new thing that reared its ugly head with UE4/5 post-2015.

What's needed more than ever is one almighty crash in the AAA gaming industry. Then you'll start to see better games on all GPUs, not just more manufactured, artificial software degradation abused as a flagship sales pitch.
 
$70 2GB VRAM
You must be clinically insane to pay this much for a GPU like that. $70 buys you a 1060 6 GB.

But all in all, sure, AAA game devs dying all at once would be a piece of really good news. What they are doing to gaming performance is inexcusable.
 
You must be clinically insane to pay this much for a GPU like that. $70 buys you a 1060 6 GB.
I was talking new prices back then, e.g., the GT 1030 8 years ago. It's absurd that textures in decade-old games can still look better on decade-old GPUs than half of today's "Low" preset textures can look on today's hardware. That screenshot "needing" 8GB VRAM is a joke. It has to be fake. It's almost maliciously, intentionally, "one of our competitors has infiltrated our company and is trying to sabotage us" level of bad.

Edit: Here's Everybody's Gone To the Rapture using 5GB VRAM @ 4K. This was a low-budget, 4-hour-long indie WALKING SIMULATOR released in 2015. Compared to looking like this on an 8-12GB VRAM GPU, the rat race isn't even sane anymore.

Or what STALKER 2 looks like on Low at 1440p running at 96fps on a 285W RTX 4070 Ti (nice, smeary-as-hell, barely readable sign on the left desk there) vs what Deus Ex: Human Revolution looks like at 1440p running at 60+fps on a 15W AMD 7840U APU (an almost 20:1 disparity in hardware horsepower). What the hell went wrong?...
 
I'm genuinely bewildered at the industry gaslighting surrounding this stuff. [...]

It's 2025: release the game now, patch later, optimise eventually or never. And RTX baked into games is only going to make things worse in terms of performance and VRAM usage.
And we don't see that much visual improvement; in fact, it's easy to find many cases where games demand more performance and more VRAM yet look much worse. Even in terms of gameplay, mechanics, and facial expressions, many games are getting worse and very few are getting better. And we can only imagine how much worse things could be if they weren't forced to optimize for consoles with limited hardware.

And don't get me started on the rays and the shiny floors.
 
Which one has "depth of field" enabled, or set to ultra/high? That matters, as many games have it heavily overdone now.
That's not DoF; that game at launch was notorious for massive texture degradation, with medium looking as good as a 1997 3D game.
[attached screenshot]
 
I'm genuinely bewildered at the industry gaslighting surrounding this stuff. [...]
That's because our focus has shifted.

It shifted away from textures to models, tessellation, lighting, shadows, and last but not least... monitor resolution. Everybody is throwing the term 4K around like someone paid them for it. But who needs 4K? If I hadn't been curious about an ultrawide aspect ratio, I'd still be on 1080p. Actually, the DPI of my 34" 1440 UW is not far off from that of the 24" 1080p I had before. But this is the maximum I'm willing to go up in resolution. If I want more clarity at the same size, I can turn on AA or VSR (but I don't need to), and I don't have desk space for anything bigger. Chances are I'll never buy into 4K, because I don't need to.
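
The density claim roughly checks out; pixels per inch is just the diagonal pixel count divided by the diagonal size in inches:

```python
import math

def ppi(w_px, h_px, diag_inches):
    # Pixels per inch along the diagonal.
    return math.hypot(w_px, h_px) / diag_inches

print(f'24" 1920x1080:  {ppi(1920, 1080, 24):.0f} PPI')  # ~92 PPI
print(f'34" 3440x1440:  {ppi(3440, 1440, 34):.0f} PPI')  # ~110 PPI
```

So the 34" ultrawide is a bit denser, but in the same ballpark.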

And if we're talking about low resolutions, who owns a handheld console and doesn't like playing games on it? Nobody? :rolleyes: That's right! Handhelds remind me of a simpler time, when the amount of VRAM on your GPU and the number of pixels on your screen weren't the only things that mattered. We've turned gaming into some sick wallet-measuring contest, and it has to stop. Like you said, AAA has to die.
 