There is a price difference. I was hoping the 7800 XT would dip SLIGHTLY below $500 USD on Black Friday or Cyber Monday deals (at least ONE model...), but I'm starting to think, if anything, it's going to go up shortly. You can get them for $500 and no less from what I can tell, although there are some very nice cards on sale at or around $500 (like $520 or $530) that include the new Avatar game (which I probably won't like but would LOVE to LOOK AT, lol). Not sure if games come with the 6950 XT, but there are at least 2 models that are not hard to get at $599 (one of them was even $599 a year ago at Xmas... I needed a new TV, tho :x ). The price difference is under $100.
As for the VRAM usage, I realize that if a game has grotesquely large textures, etc., then obviously memory bandwidth alone won't let you get away with less VRAM. Thing is, for me, the last (and ONLY) game I've EVER played that used 8GB, or a little more, of VRAM was Shadow of War like 5 years ago. And it used that much even at 1080p (according to the game's own estimate; I didn't actually play in 1080p and check Afterburner stats the way I do with every other game I play). I can't remember ever using more than 6 or 7GB in any game I really play, and that's a true rarity (maybe TW:WH1 @ 4K, or when I ATTEMPTED part 3 at 4K, lol). I know blockbuster AAA titles have used large amounts of VRAM (8ish+ GB?) for a few years now, but I don't generally play those. Next-gen Witcher 3 at 4K high/ultra (with 'ghetto' FSR, as I call it on my 1080 Ti, enabled) uses ~6GB tops so far (about the same as peak usage in the original game at 4K/ultra, which I played for 1k hours on my 1080 Ti).
Anyway, the way people would try to argue the point I was questioning in my last post (about lower VRAM, higher bandwidth), I thought, was more for games that aren't actually using ENORMOUS chunks of data at all times, but still store tons of things in VRAM for when they're needed? And the argument would be that with a much higher bandwidth card, the system would be able to move things in and out at a much faster rate, therefore minimizing the need to 'pre-load' the VRAM? Now that I'm typing this, I guess even if that WERE true at ALL, it wouldn't matter nearly as much now, since cache sizes are WAY larger than even a few years ago. I believe the 6950 XT has something like 128MB of Infinity Cache and I think the 7800 XT has half that (although it also has faster-clocked RAM and higher bandwidth, new tech/RDNA 3, etc.).
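Just to put rough numbers on that last bit (a quick back-of-the-envelope sketch; the memory speeds, bus widths, and cache sizes below are the commonly quoted specs as I understand them, so treat them as assumptions, not gospel): raw bandwidth is basically memory speed times bus width, and neither number says anything about how often the Infinity Cache actually gets hit.

```python
# Rough bandwidth comparison. Spec numbers are assumptions from commonly
# listed card specs; double-check against the actual models you're pricing.
#   raw bandwidth (GB/s) = effective memory speed (Gbps) * bus width (bits) / 8
cards = {
    # name: (memory speed in Gbps, bus width in bits, Infinity Cache in MB)
    "RX 6950 XT": (18.0, 256, 128),
    "RX 7800 XT": (19.5, 256, 64),
}

for name, (gbps, bus_bits, cache_mb) in cards.items():
    raw_gbs = gbps * bus_bits / 8  # ignores cache hit rate entirely
    print(f"{name}: ~{raw_gbs:.0f} GB/s raw, {cache_mb} MB Infinity Cache")
```

So the newer card wins slightly on raw bandwidth (~624 vs ~576 GB/s by those numbers) while the older one has double the cache; which matters more depends on the game, which is kind of my whole question.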
Thank you guys for all the responses. I gather, from what's being said, that my question regarding higher GPU clocks with lower bandwidth vs. lower GPU clocks with much higher bandwidth having any discernible advantage when paired with an older CPU (such as a 4.7GHz 6700K) in terms of framerate stability isn't really a valid question (doesn't work like that, etc.). So if none of that matters in the least, I think I may just go with the 7800 XT. I would honestly rather have the 6950, I think..., but budget IS a concern, and the fact that the 7800 XT MIGHT have an edge even at 4K in the few games I do play that have solid FSR support (Baldur's Gate 3, Witcher 3, God of War, etc., and hopefully Total War: Warhammer 3) with its "tensor cores" (AI accelerators, or whatever AMD calls them) is nudging me more and more towards the newer, cheaper card. Too bad it's not Nvidia, tho... I would much rather have an Nvidia GPU, but the 40-series cards are just NOT made with gamers like me in mind (at least not the sub-$900 ones). If anyone has a 3080 12GB that they'd be willing to sell fairly cheap, PM me and I may be interested. Or a 6950, for that matter.

Thanks again. Sorry I type so much.
What about an old 6c/12t CPU? My brother wants a new card, too. He also has a 1080 Ti (EVGA limited Black FTW - one of the only 1080 Tis with overclocked memory). I told him he may only see benefits with DLSS/FSR (and frame generation), and not so much natively, but he doesn't care. His goal is to play BG3 @ 4K/60 (and he's very anal about ANY dips in framerate). He has a 10-year-old X-class Intel build with what I believe was their first mainstream/enthusiast 6c/12t chip; I think it's the 3930K or something. He rarely uses his PC these days and will be upgrading it sooner than I will mine, but when he did use it last, he was getting 1440p/ultra/60 in the AAA games he plays, like Remnant, that Star Wars Jedi game from like 2020, and Sekiro. May have even played one of them higher than 1440p. So, at 60fps at least, these older flagship systems DO seem to hold up decently.
I assume my 6700K is stronger in new titles than his 3930K, but I could be wrong? I haven't really researched it. But he plays AAA games almost exclusively, and I thought he could actually benefit a decent amount from DLSS or FSR, especially with frame generation/interpolation (with a 4070 or 7800 XT).