
NVIDIA Explains GeForce RTX 40 Series VRAM Functionality

A pipe dream in the current market for new GPUs; it's barely achievable even buying used, last-gen AMD cards on eBay.

That is, however, what I did. I bought a used 6800 XT and 6700 XT at the prices you suggested. The 3070 was replaced by the 6800 XT, and whilst the 3060 at least had enough VRAM, it was hot and slow (thanks, Samsung 8nm!)
At the moment, keep a close eye especially on the RX 6700 (non-XT) 10GB at around $270 US.

These greedy companies need low prices, because stock will be huge.

And with most countries in the world in recession, buying a video card is more of a luxury than a necessity.

Nvidia will have to cut prices; flimsy excuses don't justify paying more for a feature-starved product like the RTX 4060.

:)
 
What's the point of better-than-expected performance if you're forced to run low settings due to VRAM limitations?

It would make the 16GB variant more viable if the performance is closer to the 4070 than I expect it to be...

Let's not kid ourselves, though: it's not like AMD has even shown up in this segment. They skipped it all the way down to the ~$300 option with an also-overpriced 8GB card, just like the 4060, which has less VRAM than its predecessor.
 
At this point the VRAM discussion is more like a shouting match: lots of unreasonable claims, everyone has their own opinion, and me, I'm still waiting for reasonable tests in reasonable scenarios. This should be a 1440p card to be used with medium settings, and that's what I want to see. Not 1080p (even if anyone can certainly use it for that), not ultra, not RT (no one actually takes RT seriously), not 4K, and not tested with a 4090 pretending to be a 4060.
The thing that's most offensive here is that Nvidia now wants to call a $400 GPU "1080p" in 2023.

The 1660 Super, a 4-year-old card that cost $250 in 2019, still breezes through "1080p" in 2023.

The 4060 Ti is capable of so much more than 1080p, but it's crippled by that 8GB. Nvidia admitted as much by failing to match graphics settings in their own cherry-picked benchmarks.
 
The thing that's most offensive here is that Nvidia now wants to call a $400 GPU "1080p"

The 1660 Super, a 4-year-old $250 card, still breezes through "1080p" in 2023.

The goal post is always moving forward; tomorrow's games at 1080p will always demand more power. But that said, resolutions are also being dropped: 1080p is entry level now, and barely anyone is using 720p anymore. Calling a 4060 a 1080p card in 2023 doesn't sit right with me at all, especially at that price. But what do I know.
 
The goal post is always moving forward; tomorrow's games at 1080p will always demand more power. But that said, resolutions are also being dropped: 1080p is entry level now, and barely anyone is using 720p anymore. Calling a 4060 a 1080p card in 2023 doesn't sit right with me at all, especially at that price. But what do I know.

Nvidia themselves have said the 4060 Ti 8GB is targeting 1080p gaming.
 
The goal post is always moving forward; tomorrow's games at 1080p will always demand more power. But that said, resolutions are also being dropped: 1080p is entry level now, and barely anyone is using 720p anymore. Calling a 4060 a 1080p card in 2023 doesn't sit right with me at all, especially at that price. But what do I know.
Yep

xx60 class has always represented "the sweet spot" and for gamers, the sweet spot moved on from 1080p60 a long time ago.
IMO, the sweet spot has been 1440p high refresh with VRR for years now. You don't need to always get >144 fps but an average of ~90fps with 1% lows of over 60 is a good place to be.
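For anyone wondering how those figures are usually derived, here's a minimal sketch of computing average FPS and "1% lows" from per-frame frame times. The sample numbers are made up for illustration; real data would come from a capture tool, and some outlets define 1% lows as the 1st-percentile frame rate rather than the average of the slowest 1% used here.

```python
# Minimal sketch: average FPS and "1% lows" from a list of frame times (ms).
# Sample data below is invented; real frame times would come from a capture tool.

def fps_stats(frame_times_ms):
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    # One common definition of "1% lows": average FPS over the slowest 1% of frames.
    slowest = sorted(frame_times_ms, reverse=True)
    worst = slowest[:max(1, len(slowest) // 100)]
    low_1pct_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_1pct_fps

# Mostly ~11 ms frames (~90 fps) with a handful of 16 ms spikes (~62 fps).
sample = [11.1] * 990 + [16.0] * 10
avg, lows = fps_stats(sample)
print(f"average: {avg:.0f} fps, 1% lows: {lows:.0f} fps")  # ~90 fps avg, ~62 fps lows
```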
 
Yep

xx60 class has always represented "the sweet spot" and for gamers, the sweet spot moved on from 1080p60 a long time ago.
IMO, the sweet spot has been 1440p high refresh with VRR for years now. You don't need to always get >144 fps but an average of >100fps with 1% lows of over 60 is a good place to be.

idk, 1440p 144Hz with non-peasant settings is still pretty high end... While I wouldn't disagree that it's what people should be targeting, most can't afford it.

[Attached chart: average FPS at 2560×1440]
 
Nvidia themselves have said the 4060 Ti 8GB is targeting 1080p gaming.

they would say a cow is a bird if it was good for their bottom line
 
they would say a cow is a bird if it was good for their bottom line

Saying a $400 4060 Ti is for 1080p honestly just makes them look bad.
 
Saying a $400 4060 Ti is for 1080p honestly just makes them look bad.

They just doubled down on the 8GB thing, so it isn't out of character for them; they chose this hill to die on, apparently. Now they can only go forward, full gas. smh
 
idk, 1440p 144Hz with non-peasant settings is still pretty high end... While I wouldn't disagree that it's what people should be targeting, most can't afford it.

[Attached chart: average FPS at 2560×1440]
Read the rest of my quote. VRR means that you don't have to hit the vsync. I'm saying that 90-100fps is a great experience good enough for just about everything outside of competitive esports.
Sure you can spend more for truly high refresh, but 1440p at "better-than-60Hz" seems to be the new sweet spot as of ~2021 or something like that.

 
Read the rest of my quote. VRR means that you don't have to hit the vsync. I'm saying that 90-100fps is a great experience good enough for just about everything outside of competitive esports.


I agree, that's way more realistic for most gamers.

1440p 80-100fps. 144hz is still a massive jump in cost.
 
I agree, that's way more realistic for most gamers.

1440p 80-100fps. 144hz is still a massive jump in cost.
And the CPU starts to matter as framerates increase. A Ryzen 5 or i5 can handle 100fps without dying.
 
And the CPU starts to matter as framerates increase. A Ryzen 5 or i5 can handle 100fps without dying.

I think a lot of gamers are actually targeting 60fps still to be honest though.
 
The other side of the VRAM 'issue' is lazy console developers.
Because consoles use unified memory, the fast/cheap thing to do is just cram all the assets into memory, since it's all one very fast pool of 16GB GDDR6 on a 256-bit/320-bit bus (no separate 'RAM' and 'VRAM'; it's all one unsegregated pool).

So come PC port time, they don't bother to properly manage memory and I/O pressure, and everything falls apart because the way they're handling assets is frankly inefficient.

Now, Nvidia knows this, and they should have made the effort to ensure that 10GB was the minimum.
Not sure what you mean by lazy on the console devs' part. Putting as many assets as possible into that unified memory is the most efficient way to use it. It would be a waste of console performance to build around the I/O and RAM limitations of a PC. It's the responsibility of either their internal or external PC port team to find out what works best on the PC platform. I agree that a 10GB/12GB minimum this generation would have been nice to see from Nvidia.
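To put rough numbers on why assets sized for a single 16GB pool don't fit an 8GB card without streaming or mip drops, here's a back-of-the-envelope sketch. Every asset count and size in it is an assumption for illustration only, and it ignores texture compression entirely.

```python
# Back-of-the-envelope VRAM budgeting. All asset counts and sizes below are
# assumptions for illustration; real engines use block compression and streaming.

def texture_bytes(width, height, bytes_per_pixel=4, mip_chain=True):
    base = width * height * bytes_per_pixel
    # A full mip chain adds roughly one third on top of the base level.
    return int(base * 4 / 3) if mip_chain else base

GiB = 1024 ** 3

# Hypothetical scene: 60 4K textures, 200 2K textures, plus geometry and buffers.
textures = 60 * texture_bytes(4096, 4096) + 200 * texture_bytes(2048, 2048)
geometry_and_buffers = 2 * GiB          # meshes, BVH, vertex buffers (assumed)
render_targets = int(0.5 * GiB)         # swapchain, G-buffer at 1440p (assumed)

total = textures + geometry_and_buffers + render_targets
print(f"resident set if nothing is streamed: {total / GiB:.1f} GiB")
# Fits comfortably in a 16GB unified console pool; on an 8GB card the PC port
# has to drop mip levels or stream textures in and out of VRAM instead.
```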
 
I think a lot of gamers are actually targeting 60fps still to be honest though.
and there's a $250 GPU market segment specifically for those gamers, full of decent options ;D
Jensen's lost touch with reality if he thinks people with a $250 budget suddenly have a $400 budget in the middle of the biggest cost-of-living crisis since the 1970s
 
I'm really feeling the love in this thread... :love:

Yeah, the thread has really brought up all the love that people have for Nvidia in here. As if AMD is any better.

Sad state this industry is in... That said, while what Nvidia claims is true, it's just not justification for their abhorrent prices.
 
Yeah, the thread has really brought up all the love that people have for Nvidia in here. As if AMD is any better.

Sad state this industry is in... That said, while what Nvidia claims is true, it's just not justification for their abhorrent prices.
AMD are charging as much as they can get away with too, but at this tier RT and frame-generation are of questionable value so why pay a premium for them?

The RX 7600 is likely to be better perf/$ not because AMD want it to be, but because their inferior RT and lack of FG mean they can't charge as much for it - Use that to your advantage!
 
Yeah, the thread has really brought up all the love that people have for Nvidia in here. As if AMD is any better.

Sad state this industry is in... That said, while what Nvidia claims is true, it's just not justification for their abhorrent prices.

My brother is due for an upgrade and he is in the $500-700 range max. I feel bad for him because all the options in that price range are crap. I will probably pitch in so he can get a 7900 XT. I really dislike the 12GB on the 4070 Ti; had Nvidia gone with 16GB, I would go with that instead for sure. Maybe the 4060 Ti 16GB will be better than I expect it to be, but I'm not holding my breath.
 
My brother is due for an upgrade and he is in the $500-700 range max. I feel bad for him because all the options in that price range are crap. I will probably pitch in so he can get a 7900 XT. I really dislike the 12GB on the 4070 Ti; had Nvidia gone with 16GB, I would go with that instead for sure. Maybe the 4060 Ti 16GB will be better than I expect it to be, but I'm not holding my breath.

Yeah, stretching out another couple hundred bucks for the 7900 XT if possible seems generally sensible to me. I'm personally not sold on DLSS 3. I would maybe be more lenient with it if Nvidia hadn't willingly withheld it from us 30-series owners, but I already tend to keep traditional DLSS off whenever possible, so frame generation couldn't possibly sway me either way.

AMD are charging as much as they can get away with too, but at this tier RT and frame-generation are of questionable value so why pay a premium for them?

The RX 7600 is likely to be better perf/$ not because AMD want it to be, but because their inferior RT and lack of FG mean they can't charge as much for it - Use that to your advantage!

RT is of questionable value, but frame generation is going to make or break these lower-end cards. Nvidia is fully factoring its frame generation technology into the general performance uplift, and they strongly encourage you to enable it regardless of the impact on image quality. In Ada's lowest segments (such as the 4050 mobile), you are essentially expected to use DLSS 3 FG to achieve playable frame rates. Sucks to be you if the game you want to play doesn't support it; mail your dev requesting it, or just don't be poor, I guess.
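As a rough illustration of why counting generated frames as raw performance is contentious, here's a small sketch comparing presented frame rate with the render rate that still governs input latency. The overhead and latency model are assumptions, not measured DLSS 3 figures.

```python
# Rough sketch: frame generation inserts one interpolated frame between each
# pair of rendered frames, so presented FPS roughly doubles, while input is
# still sampled at the render rate. Overhead/latency numbers are assumptions.

def with_frame_generation(rendered_fps, overhead=0.10):
    presented_fps = rendered_fps * 2 * (1 - overhead)  # ~2x minus generation cost
    # The interpolator has to hold back the newest rendered frame, so latency
    # is roughly two rendered frame times in this crude model.
    latency_ms = 2 * 1000 / rendered_fps
    return presented_fps, latency_ms

for base in (30, 45, 60):
    fps, lat = with_frame_generation(base)
    print(f"rendered {base} fps -> presented ~{fps:.0f} fps, ~{lat:.0f} ms of latency")
```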
 
My brother is due for an upgrade and he is in the $500-700 range max. I feel bad for him because all the options in that price range are crap. I will probably pitch in so he can get a 7900 XT. I really dislike the 12GB on the 4070 Ti; had Nvidia gone with 16GB, I would go with that instead for sure. Maybe the 4060 Ti 16GB will be better than I expect it to be, but I'm not holding my breath.
IMO the 4070 is the least-bad, most-balanced option. Its mediocre performance/$ doesn't make it stand out in the market but it's efficient, supports all the latest features and is (just about) fast enough to get away with enabling them.

It's not great, but I don't think the 4070 is crap, simply because it's bringing lower power draw and newer features to an existing price point. Don't get me wrong, the 6950XT is faster in purely raster-based performance, but I think if you have this budget it's because you don't want purely raster-based performance: You want to move all the sliders to the right and tick all of the boxes in the options menu.

Yeah, stretching out another couple hundred bucks for the 7900 XT if possible seems generally sensible to me. I'm personally not sold on DLSS 3. I would maybe be more lenient with it if Nvidia hadn't willingly withheld it from us 30-series owners, but I already tend to keep traditional DLSS off whenever possible, so frame generation couldn't possibly sway me either way.



RT is of questionable value, but frame generation is going to make or break these lower-end cards. Nvidia is fully factoring its frame generation technology into the general performance uplift, and they strongly encourage you to enable it regardless of the impact on image quality. In Ada's lowest segments (such as the 4050 mobile), you are essentially expected to use DLSS 3 FG to achieve playable frame rates. Sucks to be you if the game you want to play doesn't support it; mail your dev requesting it, or just don't be poor, I guess.
There are, what, eleven games with DLSS3 FG so far?
It's RTX's launch all over again. By the time enough games support DLSS3 FG to justify buying a card on that feature alone, the 40-series is going to be as obsolete as the 20-series is now.
 
There are, what, nine games with DLSS3 FG so far?
It's RTX's launch all over again. By the time enough games support DLSS3 FG to justify buying a card on that feature alone, the 40-series is going to be as obsolete as the 20-series is now.

Agreed, and IMO those are nine games too many. The industry should have simply rejected such a blatantly one-sided, proprietary and elitist "tech" that they are gating from their own existing customers in a shameless upsell.
 
IMO the 4070 is the least-bad, most-balanced option. Its mediocre performance/$ doesn't make it stand out in the market but it's efficient, supports all the latest features and is (just about) fast enough to get away with enabling them.

It's not great, but I don't think the 4070 is crap, simply because it's bringing lower power draw and newer features to an existing price point. Don't get me wrong, the 6950XT is faster in purely raster-based performance, but I think if you have this budget it's because you don't want purely raster-based performance: You want to move all the sliders to the right and tick all of the boxes in the options menu.

I dislike the 4070, but I've put it on his radar. I feel better about it than an $800 card with 12GB of VRAM, but the 7900 XT is about 32% faster in raster for about 200 bucks more, and that is what he primarily cares about.

There are, what, eleven games with DLSS3 FG so far?
It's RTX's launch all over again. By the time enough games support DLSS3 FG to justify buying a card on that feature alone, the 40-series is going to be as obsolete as the 20-series is now.

I personally really like FG, but it should be a bonus, not something a person should buy a card for.
 
Agreed, and IMO those are nine games too many. The industry should have simply rejected such a blatantly one-sided, proprietary and elitist "tech" that they are gating from their own existing customers in a shameless upsell.
I counted, rather than pulling a number out of my ass, and it's 11, not 9.
Still, yes. DLSS3 is a nice luxury feature to enable if you're already running the heck out of the game, but it's not a solution to needing a more powerful GPU.
 
So we now have fake bandwidth specs to go with fake frames.

These cards are utter trash. People who think "oh look, the 4060 isn't dearer than last gen" need to realise this POS has a 50-class die, 50-class bus width and 50-class bandwidth, but now L2 cache gets thrown in to make the bandwidth look much better.
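For reference, the raw numbers behind that complaint are easy to work out. The sketch below uses the published bus widths and memory speeds of the 4060 Ti and 3060 Ti, but the L2 hit rate is purely an illustrative assumption, not Nvidia's figure.

```python
# Raw memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
# The "effective bandwidth" marketing figure folds in an assumed L2 hit rate;
# the hit rate below is purely illustrative.

def raw_bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

rtx_4060_ti = raw_bandwidth_gb_s(128, 18)  # 128-bit, 18 Gbps GDDR6 -> 288 GB/s
rtx_3060_ti = raw_bandwidth_gb_s(256, 14)  # 256-bit, 14 Gbps GDDR6 -> 448 GB/s
print(f"4060 Ti raw: {rtx_4060_ti:.0f} GB/s vs 3060 Ti raw: {rtx_3060_ti:.0f} GB/s")

# If half of all memory traffic hit the larger L2 (assumption!), DRAM would only
# see half the requests, so the quoted "effective" figure doubles:
hit_rate = 0.5
effective = rtx_4060_ti / (1 - hit_rate)
print(f"'effective' bandwidth at {hit_rate:.0%} L2 hits: {effective:.0f} GB/s")
```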

The 4060 is a 4050 Ti, the 4060 Ti 8GB is the 4060, and only the 16GB Ti should be called a 4060 Ti; even then, to justify $500 it needed more cores at least. It should have been the 192-bit 12GB card, and the 4070 Ti should have been a 256-bit 16GB cut-down 4080. Raster improvements are pitiful over last gen.

AMD won't rescue you either; the 7600 will also be rubbish-class, and N33 is said to be much weaker than N32/N31 in RT, relatively speaking, i.e. its RT performance will be far lower than the ratio of CUs would suggest.
 