AMD Plays the VRAM Card Against NVIDIA

Ahh, so you now acknowledge VRAM capacity can be an issue then?
Of course it can be, you wouldn't buy a card with 1MB of VRAM for your PC in 2023. That wasn't what we were talking about.
 
Ahh, so you now acknowledge VRAM capacity can be an issue then?
Now we're in the "but raytracing doesn't matter since anyone can just turn on ultra full unreal++ path tracing and that makes all RT insufficient" or "anyone can just multiply the triangles on every model x4" or "there's a kid somewhere with Photoshop and time on their hands who will take your textures, apply 2x scaling on both axes, and call that an HD texture pack" tier of excuses.

Anything to keep the Nvidia Cult pure and untainted by criticism really. Anything that circles back to "Nvidia is never wrong, so they couldn't possibly have cheaped out on VRAM, and you won't need VRAM".
And considering how the Cult works, clearly they'll keep singing that until the 5000 series, when Nvidia finally corrects course and ships only cards with 16GB and above, save maybe for the lowest end.

AMD called the VRAM rise correctly, that's all there is to it. The ultimate irony is that it comes now, as Nvidia is pushing for an extremely severe price hike. Being cheap with VRAM while you ask for massive hikes is going to put a lot of customers sideways. I'm already seeing a small but very visible trend of cultists openly questioning how Nvidia can seriously ship a $1000 4070 Ti with too little VRAM for it, or regretting buying 3080s. The 3070 Ti buyers are some sort of first-line cannon fodder, but even the others are feeling the breath on their necks.

I expect the damage to be quite extensive by the time the 5000s come out. AMD may even have a shot at not being totally terrible, although RDNA 3 started really poorly. All they need to do is release cards at decent prices with a decent VRAM buffer (16GB, nothing below). I expect their Navi 33 to be total ass because it's still a 128-bit bus, so unless they release double VRAM on that bus, which is unlikely, it's going to be a bunch of pointless cards; might as well just get a 6600, as it'll do just as well in the long run. But for Navi 32, and obviously 31, there is no real insufficiency; all that matters will be price. And I'm not too worried about AMD actually releasing good enough prices.
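For anyone wondering why a 128-bit bus pins Navi 33 at 8GB: capacity is basically the number of 32-bit channels times the density per chip. A back-of-the-envelope sketch, assuming the usual 2GB (16Gb) GDDR6 chips; the function name and numbers are just for illustration:

    # Rough VRAM capacity from bus width, assuming one 2 GB (16 Gb)
    # GDDR6 chip per 32-bit channel; "clamshell" mode puts two chips
    # on each channel, doubling capacity at the same bus width.
    def vram_gb(bus_width_bits, gb_per_chip=2, clamshell=False):
        chips = bus_width_bits // 32      # one chip per 32-bit channel
        if clamshell:
            chips *= 2                    # two chips share each channel
        return chips * gb_per_chip

    print(vram_gb(128))                   # Navi 33: 8 GB
    print(vram_gb(128, clamshell=True))   # the "double VRAM" option: 16 GB
    print(vram_gb(256))                   # a 256-bit Navi 32-class card: 16 GB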

I am worried that these guys are in such a bad spot that, 5 months later, there's still not a peep about any new Navis, or any kind of leak, or anything really, and the driver work on Navi 31 seems to be a very painful, slow grind. You should see the power usage curve on my 7900 XT; it sometimes looks like a seismograph with how much it peaks/falls/peaks/falls/peaks/falls...
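If anyone wants to capture that seismograph for themselves on Linux, here's a minimal sketch polling the amdgpu driver's hwmon power sensor once a second; the sysfs path and the power1_average file (reported in microwatts) are assumptions that depend on your kernel and card numbering:

    # Log an amdgpu card's reported power draw once per second.
    # Assumes amdgpu's hwmon interface; card index may differ per system.
    import glob, time

    path = glob.glob(
        "/sys/class/drm/card0/device/hwmon/hwmon*/power1_average")[0]

    for _ in range(60):                   # sample for one minute
        with open(path) as f:
            watts = int(f.read()) / 1e6   # microwatts -> watts
        print(f"{watts:.1f} W")
        time.sleep(1)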
 
All they're showing is how shit PC ports are.
Back in the day, GTA V was also a "port", but it runs on any old low-end trash at minimum settings easily, and runs very well even on 2 GB or 1 GB VRAM cards...

The Australian YouTube channel (can't remember its name) did a very telling video last week:
textures disappearing, and games looking like shit because of it, on 8GB cards.

With DRAM prices collapsing like they are now, there is no reason for Nvidia not to make the 4070 a 16GB card.
The Nvidia marketing bureau just doesn't give a funk about the DRAM price collapse. What they do care about, though, is Nvidia GPU prices blowing up. ;)

They have four charts in their blog post.

The higher we move up the tiers, the more green lines we see at the bottom. Also, in the 7900 XT vs 4070 Ti chart, the perf/$ comparison is missing. I wonder why.

AMD does have a point here, and Hardware Unboxed did offer us a good reason to avoid models that are VRAM-limited even today.

It's like comparing the 2 GB and 4 GB versions of the Nvidia GT 730... of course a schoolboy, a "wannabe geek", or an "I don't know anything about PCs but I love BIG NUMBAZZ" type will instantly purchase the card with more VRAM, completely ignoring power requirements, GPU chip performance, or certain technologies. And by "technologies" I don't mean RT, or even DLSS or Reflex marketing cr*p, but the real deal, for example: NVENC.
 
Back in the day, GTA V was also a "port", but it runs on any old low-end trash at minimum settings easily, and runs very well even on 2 GB or 1 GB VRAM cards...


The Nvidia marketing bureau just doesn't give a funk about the DRAM price collapse. What they do care about, though, is Nvidia GPU prices blowing up. ;)


It's like comparing the 2 GB and 4 GB versions of the Nvidia GT 730... of course a schoolboy, a "wannabe geek", or an "I don't know anything about PCs but I love BIG NUMBAZZ" type will instantly purchase the card with more VRAM, completely ignoring power requirements, GPU chip performance, or certain technologies. And by "technologies" I don't mean RT, or even DLSS or Reflex marketing cr*p, but the real deal, for example: NVENC.

Let's ignore the brand names on cards and marketing slides and see what Nvidia buyers of 8GB cards are losing by accepting that limitation.

 
Absolutely brilliant. People will say Hogwarts is only using 6.3 gigs, so it's fine on an 8-gig card, yet look at the issues with the stuttering caused by asset swapping and, in the reviewer's words, "muddy textures".

Some other games were tested, and he confirmed that, of course, the game handles it automatically by not loading the proper textures, or, if it doesn't, the game crashes. I think this video finally shows the actual problems, and reviewers should update how they assess VRAM when reviewing cards in the future. Don't leave benchmarks unattended, as you'll have no idea what's going on.
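One low-effort way to babysit a benchmark on the VRAM side is to log usage while it runs. A minimal sketch using nvidia-smi's query flags; note that allocation numbers alone won't reveal muddy textures (which is the whole point above), so pair the log with your own eyes:

    # Poll VRAM usage once per second during a benchmark run.
    # A reading that parks well below the card's capacity in a heavy
    # scene can hint that assets are being quietly downgraded.
    import subprocess, time

    def vram_used_mib():
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"])
        return int(out.decode().split()[0])

    for _ in range(300):                  # ~5 minutes of sampling
        print(f"{vram_used_mib()} MiB in use")
        time.sleep(1)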
 

Attachments

  • 8gigsenough.png (2.7 MB)
Absolutely brilliant. People will say Hogwarts is only using 6.3 gigs, so it's fine on an 8-gig card, yet look at the issues with the stuttering caused by asset swapping and, in the reviewer's words, "muddy textures".
You see what you want to see, I guess. If it's using a little over 6GB on an 8GB card, that issue is caused by poor asset management, not lack of VRAM. Or at least that's what my programmer eyes see.
 
You see what you want to see, I guess. If it's using a little over 6GB on an 8GB card, that issue is caused by poor asset management, not lack of VRAM. Or at least that's what my programmer eyes see.
To a gamer that isn't relevant, though; we can't modify the source code of the game.

Bear in mind it might be that, e.g., to load the proper textures it needs 9 gigs; the 9 gigs isn't there, so it then uses the 6.39 gigs for the lower-quality textures.

I am simply looking at it from a gamer's eyes: it's there right in front of you, muddy textures and momentary freezes due to lack of VRAM.

Do you go on a campaign to get thousands of devs to code games differently (lower the quality available), or do you simply add more VRAM to the hardware?
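A toy sketch of the fallback described above, purely illustrative (no engine works exactly like this): if the full-quality set wants ~9 gigs and only 8 are there, keep dropping a mip level (quartering a texture's size) until the set fits, which is exactly how you land on a number like 6.39 with muddier textures:

    # Toy model of budget-driven texture degradation: drop the top mip
    # level (1/4 the size) from the largest texture until the working
    # set fits in the VRAM budget.
    def fit_to_budget(texture_sizes_mb, budget_mb):
        sizes = list(texture_sizes_mb)
        while sum(sizes) > budget_mb:
            biggest = sizes.index(max(sizes))
            sizes[biggest] /= 4           # fall back one mip level
        return sizes

    full_quality = [1024] * 9             # ~9 GB of "proper" textures
    print(sum(fit_to_budget(full_quality, 8 * 1024)) / 1024, "GB used")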
 
To a gamer that isnt relevant though, we cant modify the source code of the game.

Bear in mind it might be e.g. to load the proper textures it needs 9 gigs, the 9 gigs isnt there, so you then use the 6.39 gigs for the lower quality textures.

I am simply looking at it from a gamer's eyes, its there right in front of you, muddy textures and momentary freezes due to lack of VRAM.

Do you go on a campaign to get 1000s of dev's to code games differently (lower the quality available), or do you simply just add more VRAM to hardware?
If you let developers off the hook, hardware will never be able to keep up.

Also, what you're describing would only happen if developers were all noobs. That's not how textures are handled (look up mipmaps, if you're curious).
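For the curious, a quick sketch of why mipmaps matter here: a full mip chain costs only about a third more memory than the base texture, and lets the renderer sample smaller levels gracefully instead of doing all-or-nothing swaps. Numbers assume an uncompressed RGBA8 texture, just for illustration:

    # Memory cost of a full mip chain for a square RGBA8 texture:
    # each level is a quarter of the previous one, so the whole chain
    # converges to ~4/3 of the base level's size.
    def mip_chain_mb(width, height, bytes_per_pixel=4):
        total = 0
        while width >= 1 and height >= 1:
            total += width * height * bytes_per_pixel
            width //= 2
            height //= 2
        return total / 2**20

    print(mip_chain_mb(4096, 4096))       # ~85 MB vs 64 MB for the base level alone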
 
If you let developers off the hook, hardware will never be able to keep up.

Also, what you're describing would only happen if developers were all noobs. That's not how textures are handled (look up mipmaps, if you're curious).
They already rule the roost. That won't change.
 
They already rule the roost. That won't change.
Well, if you think hardware is easier to fix than software... more power to you.
 
Well, if you think hardware is easier to fix than software... more power to you.
It's a centralised component versus a big mesh of software and developers, so yep, I think it's easier. Especially when you consider that many games are no longer maintained, so they will never get fixes.
 
If you let developers off the hook, hardware will never be able to keep up.
Hilarious.

Reminds me of that Intel engineer that summarised the entirety of the tech industry some ten years ago:
"No matter how much more power we build, they manage to use it!"

"letting developers off the hook" is what we've been doing since 1970. It's what we'll be doing until 2070. It's why Lisa Su talks about Zettascale.
Whatever line you think to hold on to will be broken, that's the very point of this industry.
It's TechPowerUp, not TechPowerStay!
 
Hilarious.

Reminds me of that Intel engineer that summarised the entirety of the tech industry some ten years ago:
"No matter how much more power we build, they manage to use it!"

"letting developers off the hook" is what we've been doing since 1970. It's what we'll be doing until 2070. It's why Lisa Su talks about Zettascale.
Whatever line you think to hold on to will be broken, that's the very point of this industry.
It's TechPowerUp, not TechPowerStay!
I'm not thinking about holding a line; progress will obviously happen. At the same time, I will not buy hardware I don't otherwise need because some random developer is too lazy to optimize their work. And by "optimize", I mean make it work at least as well as other implementations do.
 
Well, if you think hardware is easier to fix than software... more power to you.
A hardware fix is feasible for the end user. Holding devs accountable? Yeah right, consumers have tried (and failed) to get them to listen.

I can buy a 16GB GPU. I cannot give a dev a poor employee review for shoddy code.
 
A hardware fix is feasible for the end user. Holding devs accountable? Yeah right, consumers have tried (and failed) to get them to listen.

I can buy a 16GB GPU. I cannot give a dev a poor employee review for shoddy code.
You can. You play something else instead. Many developers receive compensation based on how their title fares in the first few weeks (yep, that's official hit-and-run tactics).
 
You can. You play something else instead. Many developers receive compensation based on how their title fares in the first few weeks (yep, that's official hit-and-run tactics).
Try convincing people, especially young ones, that they have to avoid Game X, the one everyone is playing, because it isn't optimized as it should be. Good luck with that.
It's not uncommon for people to finish games in their first buggy version(s) instead of waiting for patches that improve quality and performance. It's probably the rule, not the exception. People with the patience to wait for proper patches and price cuts are not the majority. If they were, companies wouldn't be rushing games out in what someone could fairly describe as beta versions.

As for VRAM and developers: considering that system requirements are what pushes people to upgrade, I bet that sponsored titles from AMD, Intel, or Nvidia don't come with an obligation for developers to do their best on optimization and memory management.
 
You see what you want to see, I guess. If it's using a little over 6GB on an 8GB card, that issue is caused by poor asset management, not lack of VRAM. Or at least that's what my programmer eyes see.

In the current top-rated gaming dominion, asset management and optimisation are, irrefutably, "secondary" to hardware limits. The primary offender can't be so easily pardoned here.

Card manufacturers are, and have been for years, hampering "progress" with curtailed VRAM rations for the lower/mid-segment performance minions. It's 2023, and in some supposedly graphics-immersive titles we're still getting 10-year-old polished texturisation... rotten to the bone! Slap "ultra" on it and people get hypnotised by it.

Game devs are simply not getting a large enough code-artist paint palette to push the more enriched or more consistent visual fidelity that is "readily accessible". In return, the 8GB-limited crowd is compelled to settle for asset substitutions and what now seems to be top-graft optimisations, which spells out "visual quality impairment". Some of us have become accustomed to these constraints, with just-about-acceptable high/ultra graphics eye-candy, without realising the missed upper-class graphics opportunities. That's the nature of VRAM limitations: too many bottom-barrel compromises.

Another thing to consider: for some time now, we the consumers have been paying VRAM-taxing RT/PT levies. It's completely mind-baffling having to pay these extortionately high premiums and still be presented with memory bottlenecks. Even with RT disabled, the bottleneck for premium-quality graphics (textures playing a big part) is extremely evident. To offer a little life, dynamic texture asset substitutions are highly observable, especially in denser graphics environments. This is hardly a solution, more of a temporary med-patch before the bigger and better bleeds you to death.

In the grand scheme of things, it's not just whether games can comfortably sit in the inferior-quality 8GB VRAM bracket, but why we the consumers should encourage "inferiority" to exist in the first place. Forget the recent outcry from AMD or independent reviewers: more VRAM was always something game developers longed for, to push some of that higher visual fidelity down to the mainstream budget/mid-performance gaming segment. There's a lot of "but I can still manage great FPS and smooth gameplay"... of course you can, at the cost of less-than-desirable memory-abetted wash-ups. I want a boatload of frames per second, but each frame would be way nicer with some consistent eye-candy affirmative action, as opposed to a dynamically repulsive one.
 
With AMD also restricting VRAM to 8GB on their 7600 series cards, this will not become an AMD vs Nvidia topic where more VRAM looks like a "red team argument", and maybe that will be a good thing. Not for consumers, who will have to keep getting that VRAM restriction in their faces, but at least for more open-minded dialogue.
Just a reminder: excluding the 8GB HD 2900/X models, and starting with the RX 480 that came out in 2016, we are about to close in on 10 years, and will probably keep counting many more, until the sub-$300 segment moves past 8GB of VRAM.
 
With AMD also restricting VRAM to 8GB on their 7600 series cards, this will not become an AMD vs Nvidia topic where more VRAM looks like a "red team argument", and maybe that will be a good thing. Not for consumers, who will have to keep getting that VRAM restriction in their faces, but at least for more open-minded dialogue.
Just a reminder: excluding the 8GB HD 2900/X models, and starting with the RX 480 that came out in 2016, we are about to close in on 10 years, and will probably keep counting many more, until the sub-$300 segment moves past 8GB of VRAM.

The (insolent) joke's on AMD if these 7600 cards are capped at 8GB.

To think of AMD complaining about nV's lack of adherence and then air-dropping 8GB poop-bombs on their own mid-segment cards is just revolting. This is a perfect, long-awaited opportunity to size up on VRAM at all performance tiers, and it would be a downright shame if the largest consumer group (low/mid performance) were once again consigned to oblivion.
 
The (insolent) joke's on AMD if these 7600 cards are capped at 8GB.

To think of AMD complaining about nV's lack of adherence and then air-dropping 8GB poop-bombs on their own mid-segment cards is just revolting. This is a perfect, long-awaited opportunity to size up on VRAM at all performance tiers, and it would be a downright shame if the largest consumer group (low/mid performance) were once again consigned to oblivion.
Totally agreed. What makes it worse is that it's the first GPU they're releasing after that marketing article.
 
Try convincing people, especially young ones, that they have to avoid Game X, the one everyone is playing, because it isn't optimized as it should be. Good luck with that.
It's not uncommon for people to finish games in their first buggy version(s) instead of waiting for patches that improve quality and performance. It's probably the rule, not the exception. People with the patience to wait for proper patches and price cuts are not the majority. If they were, companies wouldn't be rushing games out in what someone could fairly describe as beta versions.

As for VRAM and developers: considering that system requirements are what pushes people to upgrade, I bet that sponsored titles from AMD, Intel, or Nvidia don't come with an obligation for developers to do their best on optimization and memory management.
It's just entertainment, I really don't have to convince anyone ;)
It's just puzzling to see a discussion around entertainment turn into hatred towards one side because they're supposedly holding back our entertainment rights.
 
It's just entertainment, I really don't have to convince anyone ;)
It's just puzzling to see a discussion around entertainment turn into hatred towards one side because they're supposedly holding back our entertainment rights.
Oh come on. You always see things in the way that favors one specific side. In this case, it's the devs' fault.

Also, it is puzzling that you talk about not convincing anyone when, above, you said that people should play something else instead. Well, guess what: that does mean convincing someone, and you know it, because you follow up that suggestion with the argument that devs won't get the compensation they expect. Arguments are used when we try to convince someone of something, even when we don't intend to insist on our suggestion.

In any case, convincing the public about game optimization is mandatory for forcing developers to make optimization a priority. As long as games sell millions of copies in their original buggy versions, optimization will not be a top priority in development.
 
Oh come on. You always see things in the way that favors one specific side. In this case, it's the devs' fault.

Also, it is puzzling that you talk about not convincing anyone when, above, you said that people should play something else instead. Well, guess what: that does mean convincing someone, and you know it, because you follow up that suggestion with the argument that devs won't get the compensation they expect. Arguments are used when we try to convince someone of something, even when we don't intend to insist on our suggestion.

In any case, convincing the public about game optimization is mandatory for forcing developers to make optimization a priority. As long as games sell millions of copies in their original buggy versions, optimization will not be a top priority in development.
I was telling it how I see it. Some will agree with that, some won't. If you think that's me trying to convince people... I won't try to convince you otherwise.
 
I was telling it how I see it. Some will agree with that, some won't. If you think that's me trying to convince people... I won't try to convince you otherwise.
You didn't broadcast a message on planet-wide television. I am not saying you are trying to convince anyone HERE with a couple of posts in a forum. You are not starting a movement for optimized games.
But saying "you play something else instead" or "many developers receive compensation based on how their title fares" does mean, for either to happen, convincing the absolute majority of gamers to avoid games that need more than 8GB of VRAM because of bad optimization rather than an absolute need for more VRAM. Because there is also the case where the extra VRAM is needed no matter how much optimization someone does on a game.
Now, realistically, most people waiting for a game don't wait an extra 6-12 months for it to get optimized, and games with a gazillion bugs do sometimes sell extremely well. Meaning the extra VRAM is the only workaround available to consumers.
 
You didn't broadcast a message on planet-wide television. I am not saying you are trying to convince anyone HERE with a couple of posts in a forum. You are not starting a movement for optimized games.
But saying "you play something else instead" or "many developers receive compensation based on how their title fares" does mean, for either to happen, convincing the absolute majority of gamers to avoid games that need more than 8GB of VRAM because of bad optimization rather than an absolute need for more VRAM. Because there is also the case where the extra VRAM is needed no matter how much optimization someone does on a game.
Now, realistically, most people waiting for a game don't wait an extra 6-12 months for it to get optimized, and games with a gazillion bugs do sometimes sell extremely well. Meaning the extra VRAM is the only workaround available to consumers.
Yeah, I don't get that part either. Aside from publishers' hit-and-run tactics, I don't understand why anyone would be "waiting for a game" and must play it within a month of release. This used to be a thing in the 80s and 90s, when you had to purchase a physical copy or be out of luck. But today, games are available pretty much everywhere and can be played at any time.
 