
upgrading gpu in my 6700K-based (aging) PC for 4k 60fps gaming

Funny to think how many upgrades I've had in that time myself.

Pentium G4400 -> G4560 -> 7600K -> 7700K -> 5820K -> R5 2600 -> R5 3600 -> R7 5800X
R9 290 -> GTX 780 Ti -> GTX 970 SLI -> GTX 980 -> GTX 780 -> R9 290 CF -> GTX 980 Ti -> GTX 1080 Ti -> RX 6700 XT

I guess the benefit of having a high-end system is that you don't need to upgrade as often as I do. :laugh:
Or one upgrades often because they're not happy for long.
 
What were you using prior to the 6800xt? How much improvement did you see? Also, do you use FSR usually? Or always go native if possible?

And, yeah man, regarding fps... for one thing I only game on TVs nowadays, so 60fps is the max for me. The reason I don't mind it so much is that I use interpolation on my TV, which makes it feel like ~90fps - really smooth on such a big screen, especially in slower games like 4X/Civ-type games where you mostly pan around (which I use a Steam Controller for, and it's been a literal game changer since I started doing that in 2016 w/ my TV). Until I get around to a TOTAL (and expensive) overhaul and new build maybe 2 years down the road, 60fps w/ vsync and I'm golden.

This PC cost me nearly 3 grand to build and that was in early 2016 (with the initial 980 Ti); I'm squeezin' it for all I can - it's been on 24/7/360 for nearly 8 years. I say 360 because it's turned off for maybe 5 days out of the year, and it's been close to FLAWLESS (I thought I'd be re-installing Windows 10 every few months or so since I am anal about my OS being clean and lean, but to my shock Win10 has been so great I never had to do it ONCE). I love this PC. :|

Before I went 4K I had a 1080p Samsung TV and was using a GTX 960 with it. I went with the 4K screen upgrade and a 6800xt during the Covid lockdowns, and also changed the PSU to an 850 W CWT-based ADX unit (transient spikes might be an issue with RDNA 2 for you - bit of a PSU killer).

I don't play many newer games, mainly Insurgency: Sandstorm (and others play WoT, Fortnite, etc.), though I got hold of the Harry Potter game just to test; that runs fine with small settings tweaks. All native res - I never use upscaling; I didn't spend 1K on the card and similar on a screen to see fuzz. I'm also not interested in RT, which is a reason I went AMD. I also heavily tweak the 6800xt's settings: fast timings, undervolt and higher boost (1075 mV and ~2550 MHz works well in most games), for 15%+ over stock performance.

What works for you will depend on which games you play and your exact use case, though.
 
I game almost exclusively at 4k (on a 55" 60hz tv), and I don't game lower than 60fps ... the goal is to continue to play the games I play at 4k/ultra/60.
Of the three cards you mentioned, the 6950XT is the one most suitable for 4K. However, keeping a constant 60 fps in native 4K on ultra, especially in newer games, is a tall order. And I don't mean the averages, but the percentile lows, perceived as stutter/hitching. Also, in order to take advantage of the faster GPU, most games will require a CPU with equal performance.
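If you want to quantify those lows instead of guessing, here's a minimal sketch that computes the 1% low from a frametime log (assuming one frametime in milliseconds per line, which tools like CapFrameX or an Afterburner log export can give you; the filename is hypothetical):

```python
# Minimal sketch: average FPS and "1% low" FPS from a frametime log.
# Assumes one frametime in milliseconds per line; the filename is
# hypothetical -- point it at whatever your logging tool exports.
import statistics

with open("frametimes_ms.csv") as f:
    frametimes = sorted(float(line) for line in f if line.strip())

avg_fps = 1000 / statistics.mean(frametimes)

# The worst 1% of frames (the longest frametimes) are what you feel
# as stutter, even when the average looks fine.
worst = frametimes[-max(1, len(frametimes) // 100):]
low_1pct_fps = 1000 / statistics.mean(worst)

print(f"avg: {avg_fps:.1f} fps, 1% low: {low_1pct_fps:.1f} fps")
```

A big gap between the two numbers is the hitching I'm talking about.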

"Beefiest" game I'm playing atm is the next-gen version of Witcher 3 ... I would like to push that up to max settings at 4k, WITHOUT FSR/DLSS if possible (but if I still need it then no big deal).
You will need at least a 7900XT or a 4070Ti to see 60 fps in the most graphically demanding areas of the game, without sacrificing quality. In fact, these cards are my minimum recommendation for 4K60 on ultra. But if you're happy with FSR/DLSS balanced, even a 6800 or a 4060Ti would do.

I also play a lot of 4x-type games, which are usually fine, but in the last year or so there have been a few big games that I am unable to play, such as Age of Wonders 4 (which my system can't handle at 1440p even near max/60, and is one of my most-wanted games).
You are being held back by the ST performance of your CPU in the AoW series. The Creator engine they're based upon is notorious for relying on a single thread. As has been said, the 6700K may already be bottlenecking your current graphics card in certain games. Please check GPU utilization in-game with V-sync off and no frame rate cap -- anything below 90% will point to the CPU not keeping up.
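If you'd rather log utilization than eyeball an overlay, here's a quick sketch (it assumes an NVIDIA card with nvidia-smi on the PATH, which fits your 1080 Ti; AMD needs different tooling):

```python
# Quick GPU utilization logger -- assumes an NVIDIA GPU and that
# nvidia-smi is on the PATH. Run it while the game is running.
import subprocess
import time

samples = []
for _ in range(60):  # one sample per second for a minute of gameplay
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    samples.append(int(out.strip()))
    time.sleep(1)

under_90 = sum(s < 90 for s in samples)
print(f"average: {sum(samples) / len(samples):.0f}% utilization, "
      f"samples under 90%: {under_90}/{len(samples)}")
```

Lots of samples under 90% while the frame rate sits below your target points at the CPU.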

The 6700K @ 4.7 GHz has about the same ST performance as a stock 3300X, and your 1080Ti is roughly equivalent to a 2070S:

[attached chart: relative CPU single-thread and GPU performance comparison]


I also own RDR2 and God of War on steam, and I'd LOVE to play those at 4k as well.
These are slightly less demanding titles, and something like a 7800XT shouldn't have a problem keeping 60 fps in maxed out 4K.

I also play all 3 Total War: Warhammer games at 1440p near max (well, part 1 I can play at 4k near max) and I would LOOOOVE to play those on the highest settings I possibly can at 4k (I'd even be fine with 1440p MAX for TWW3 if I had to..).
In heavy battles, only top-end cards will be able to maintain constant 60 fps in TW:W3 in 4K ultra. For 1440p you'll be fine with a 7800XT as well.

Oh, one other AAA game I'm planning to play asap once I upgrade is Baldur's Gate 3 (so I can play at 4k/60 - I currently can, but only with FSR 1, which is what they use I think, and obv not close to max).
Again, the 7800XT should be able to hit 60 fps most of the time.

I would like to make sure everything runs as smoothly as possible regarding 1% lows
That unfortunately may not be possible without a complete platform upgrade. But if you insist on only purchasing the GPU at this time, given your three choices I'd go with the 7800XT. It's a very capable 1440p card that will let you enjoy many games in 4K60, with a little bit of tweaking.
 
There is a price difference. I was hoping the 7800 XT would go SLIGHTLY below $500 USD in the Black Friday or Cyber Monday deals (at least ONE model...), but I'm starting to think, if anything, it's going to go up shortly. You can get them for $500 and no less from what I can tell, although there are some very nice cards on sale at or around 500 (like 520 or 530) that include the new Avatar game (which I probably won't like but would LOVE to LOOK AT, lol). Not sure if games come with the 6950 XT, but there are at least 2 models that are not hard to get at $599 (one of them was even 599 a year ago at xmas... I needed a new TV, tho :x ). The price difference is under $100.

As for the VRAM usage, I realize if a game has grotesquely large textures, etc. then obviously memory bandwidth shouldn't let you get away with a smaller amount of VRAM. Thing is, for me, the last (and ONLY) game I've EVER played that used 8 GB, or a little more, of VRAM was Shadow of War like 5 years ago. And it used that much even at 1080p (according to the game; I didn't actually play in 1080p and look at Afterburner stats the way I do with every other game I play). I can't remember ever using more than 6 or 7 GB in any game I really play, and that's a true rarity (maybe TW:WH1 @ 4k or when I ATTEMPTED part 3 at 4k, lol). I know blockbuster AAA titles have used large amounts of VRAM (8ish+ GB?) for a few years now, but I don't generally play those. Next-gen Witcher 3 at 4k high/ultra (with 'ghetto' FSR, as I call it on my 1080 Ti, enabled) uses ~6 GB tops so far (about the same as peak usage in the original game at 4k/ultra, which I played for 1k hours on my 1080 Ti).

Anyway, the way people would try and argue the point I was questioning in my last post (about lower VRAM, higher bandwidth), I thought, was more for games that aren't actually using ENORMOUS chunks of data at all times, but still store tons of things in VRAM for when they are needed? And the argument would be that with a much higher bandwidth card, the system would be able to move things in and out at a much faster rate, therefore minimizing the need to 'pre-load' the VRAM? Now that I'm typing this, I guess even if that WERE true at ALL it wouldn't matter nearly as much, as cache sizes are WAY larger than even a few years ago. I believe the 6950 XT has like 128 MB and I think the 7800 XT has half that (although it also has faster-clocked RAM and higher bandwidth, new tech/RDNA3, etc.).
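Just to sanity-check my own question, quick napkin math on the raw numbers (specs quoted from memory, so treat them as approximate):

```python
# Back-of-the-envelope: raw VRAM bandwidth = per-pin rate x bus width.
# Spec numbers are from memory -- approximate, not authoritative.
def bandwidth_gb_s(gbps_per_pin, bus_width_bits):
    return gbps_per_pin * bus_width_bits / 8

cards = {
    "RX 6950 XT (128 MB Infinity Cache)": bandwidth_gb_s(18.0, 256),
    "RX 7800 XT (64 MB Infinity Cache)": bandwidth_gb_s(19.5, 256),
}
for name, bw in cards.items():
    # e.g. time to stream a hypothetical 6 GB of assets at the raw rate
    print(f"{name}: {bw:.0f} GB/s, ~{6 / bw * 1000:.1f} ms per 6 GB")
```

Less than a millisecond of difference to move 6 GB either way, so I doubt raw bandwidth alone changes anything about frame pacing.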


Thank you guys for all the responses. I gather, from what's being said, that my question regarding higher GPU clocks with lower bandwidth vs. lower GPU clocks w/ much higher bandwidth having any discernible advantage when paired with an older CPU (such as a 4.7 GHz 6700K) in terms of framerate stability isn't really a valid question (doesn't work like that, etc.). So if none of that matters in the least, I think I may just go with the 7800 XT. I would honestly rather have the 6950, I think..., but budget IS a concern, and the fact that the 7800 XT MIGHT have an edge even at 4k in the few games I do play that have solid FSR support (Baldur's Gate 3, Witcher 3, God of War, etc., and hopefully Total War: Warhammer 3) with its "tensor cores" (AI accelerators or w/e AMD calls them) is nudging me more and more towards the newer, cheaper card. Too bad it's not Nvidia, tho.... I would much rather have an Nvidia GPU, but the 40-series cards are just NOT made with gamers like me in mind (at least not the sub-$900 ones). If anyone has a 3080 12GB that they'd be willing to sell fairly cheap, PM me and I may be interested. Or a 6950 for that matter ;))

Thanks again. Sorry I type so much.


what about an old 6c/12t CPU? My brother wants a new card, too. He also has a 1080 Ti (EVGA limited Black FTW - one of the only 1080 Ti's with overclocked memory). I told him he may only see benefits with DLSS/FSR (and frame generation), and not so much natively, but he doesn't care. His goal is to play BG3 @ 4k/60 (and he's very anal about ANY dips in framerate). He has a 10-yr-old X-class Intel build with what I believe was their first mainstream/enthusiast 6c/12t chip; I think it's the 3930K or something. He rarely uses his PC these days and will be upgrading sooner than I will, but when he did use it last he was getting 1440/ultra/60 in AAA games he plays like Remnant, that Star Wars Jedi game from like 2020, and Sekiro. May have even played one of them higher than 1440. So, at 60fps at least, these older flagship systems DO seem to hold up decently.

I assume my 6700K is stronger in new titles than his 3930K, but I could be wrong? I haven't really researched it. But he plays AAA games almost exclusively, and I thought he could actually benefit a decent amount from DLSS or FSR, especially with frame generation/interpolation (with a 4070 or 7800 XT).
In TW Warhammer you will be heavily CPU bottlenecked. You simply can't play that properly on a 4c/8t anymore. Part 2 is problematic. Part 3 impossibru. It's prob okay at campaign start, but 50 turns later it's a slideshow or stutterfest. FSR3 won't save you.
 
money better spent IMHO- https://pcpartpicker.com/list/B4RFKX

Run that 1080ti a little longer and actually see what it can do. When you can afford it, upgrade the 1080ti to whatever you decide on then.
 
Yes, of all the games I have tried playing so far, AoW4 was the one where I kinda felt right away my CPU was an issue. But I'm sure I will see SOME benefit in the game with a GPU upgrade nonetheless.

As for W3, I'm able to play the whole thing with my current setup at 4k w/ everything on High and one setting on Ultra (I forget which, particle effects or textures or something - it's been a few weeks since I've played), with FSR on balanced. A buttery 60 w/ vsync is possible throughout the entire game save for VERY few areas when certain weather effects show up (there's one part of a forest, forget which, with lots of wolves and fog/rain that always dips me into the high 40s for a sec here and there - but I avoid it). Other than that, I have to TRY to get it to drop by doing things like cramming myself into the corner of an inn and getting the camera into a very specific position close to my face/hair. It's a game I played plenty in the older edition as well. That game is VERY well optimized and runs quite well at 60fps on older chips, it seems.

TW:WH is impossible to keep at 60 at all times for me, but for some reason that's the ONLY game where I honestly don't notice much and don't really care when it dips in heavy combat. As long as I get 60 on the campaign map and deployment I'm good (which I can't with part 3 in certain areas unless I'm at 1440 med/high and never with reflections on - with 2 I'm fine at 1440p and part 1 I go to 4k).

I understand there will be games where my cpu holds me back, but honestly I think people generally underestimate older CPUs. Gaming at 4k or 1440p at 60 isn't that tall of an order if you aren't playing the most bleeding-edge AAA blockbusters or notoriously ambitious and horribly-optimized AA/indie titles.

I'm quite eager to see for myself just what kind of improvements going from a 1080 Ti to a 7800 XT will net me. I'll return to this thread (unless the mods would rather I not resurrect it) in a few weeks after I've had a chance to try it out a bit. And if it's a sh**t upgrade, I can always return it for a full refund to where I'm getting it from, and I'll just start slowly saving for my next killer PC build down the line.

money better spent IMHO- https://pcpartpicker.com/list/B4RFKX

Run that 1080ti a little longer and actually see what it can do. When you can afford it, upgrade the 1080ti to whatever you decide on then.
When I build a full PC I will want near-flagship parts across the board. This is one of the reasons I'm avoiding upgrading the platform atm, even though Newegg has a deal with a 13700K + Gigabyte mobo + 32 GB of DDR5 G.Skill RAM for $499 ($30 less than the 7800 XT I'm eyeballing). I have other reasons as well.. But, yea, I generally want my PC to be as ROCK solid and capable (OC, etc.) as possible since I have it on 24/7 all year long. Budget for the PC will most likely be around 2-2.5k sans GPU, which I can't afford just now.

fwiw, the card I'm "eyeballing" is the ASRock Phantom Gaming 7800 XT. It's sold out on Amazon but it's still on Newegg for $530 w/ a free copy of the new Avatar game (eye candy only... I don't imagine I'd enjoy the game itself but who knows). I LOVE the looks of that card, and the few reviews I've watched of it have stated it has near (or actual) best-in-class cooling/performance when it comes to the current stock of 7800's (with the Nitro+ being close). If anyone has any other suggestions or reasons that specific card is NOT good, I'm all ears.

I've also just read a few places that ASRock plans to implement its own version of AI upscaling, and it will be proprietary and only work on ASRock gpus.... This wasn't really something I weighed heavily when making my choice, but it seems interesting.
 
man..... after doing a lot of research I thought I was making the right move. Maybe I still did... time will tell. I ended up getting a 7800 XT (ASRock Phantom Gaming) for $530 on Newegg. They were sold out on Amazon and Newegg had it as a Cyber Monday deal (with the Avatar game), and fomo'd me with "more than 184 ppl have this in their cart", lol. Anyway, I chose that card due to reviews saying it had very good performance and cooling (near best in class, even beating the Sapphire Nitro, which I know is usually one of the top AMD cards). I also love the way it looks, AND I read something about ASRock working on their own AI upscaling tech that will ONLY work with ASRock cards (didn't weigh much into my decision, but it is something to think about). I am also hedging on AMD doing something (soon?) involving the new AI Accelerators on these 7800 XT cards. I really hope they do, because the fact that Nvidia cards can choose between both DLSS *and* FSR seems really nice (even though, at least in stills, FSR 2 looks far superior to DLSS to me, as I'm not a fan of the blur - maybe in action it's harder to tell). I just really hope I made the right choice because it's also going to be a gift for my brother for HIS computer (very similar to mine but slightly older - although his processor is 6c/12t, also running at 4.7 GHz - but he will be upgrading his platform much sooner than I will).

The one thing I'm starting to feel bummed about is that the price on ALL of the 7800 XTs has remained the same, even past Black Friday and Cyber Monday, etc. In fact, there's now a cheaper one on the market (although it's $15 cheaper, I wouldn't have bought that model). And.....sure enough.... ALL the 6950 XTs that were $599 on both Newegg and Amazon (2 models in each store) are now either out of stock or back up to $800..... I really hope I made the right choice :x

Does anyone know anything about what AMD plans to use the "AI Accelerators" for? Maybe they have their own version of Nvidia's DLAA coming out? Or a way to squeeze more out of FSR (as I guess FSR works just as well for EVERY card, AI-specific cores or not, so no need for them there ATM...). We (me and my brother) certainly won't be doing anything other than gaming with the cards, so it's not like we're going to get any use out of them using TensorFlow or PyTorch or w/e it is people play with nowadays. And a big part of what drove me towards the 7800 XT's over the 6950's, even though we both game at 4k, is the fact that none of the older AMD cards had any answer to Nvidia's "Tensor Cores" until now (along with whatever benefits rdna3 may bring that would 'push' these 7800's ahead of the older 6950's I was also heavily considering).
 
7800xt is a great gpu and will power you for years to come, now game on
now ya gotta plan for the cpu/ram/mobo after this and see your 7800xt fly..
 
Funny to think how many upgrades I've had in that time myself.

Pentium G4400 -> G4560 -> 7600K -> 7700K -> 5820K -> R5 2600 -> R5 3600 -> R7 5800X
R9 290 -> GTX 780 Ti -> GTX 970 SLI -> GTX 980 -> GTX 780 -> R9 290 CF -> GTX 980 Ti -> GTX 1080 Ti -> RX 6700 XT

I guess the benefit of having a high-end system is that you don't need to upgrade as often as I do. :laugh:

I would really love to know why you went to a 5820K after having a 7700K (and, obv, from the newer mobo/platform to the older one). I had a lot of trouble deciding whether to go with a 6700K or the 5820K when I first built this PC in March of 2016. That decision was driving me nuts, haha. If you're doing more than just gaming then I guess you can ignore this, since obviously that chip was far superior in multi-threaded work (like, actual "work") loads, lol.

Funny thing is that almost ever since I built the thing I've seen NO ONE discussing the 5820K anymore (not in any more benchmarks for games or anything). I've seen SO many threads with people asking questions about their 6700K systems (like this one, heh) over these years but I don't think I've ever come across a single one that was the same type of post but regarding a 5820K system. Maybe it's cause most of the people who owned that platform were more enthusiast-types and just ended up jumping to a new system in a generation or two... I guess that's all I can think of.
 
In that same time I went from

i5 3570K -----------------> i7 9700K --------> 5800X3D
GTX 770 SLI -> RX 480 -> Vega 64 (failed) -> RX 6800 XT.

And honestly I should have skipped the 9700K. It didn't fix my issue with Sins of a Solar Empire; the X3D did. Should have kept that old Ivy Bridge a few years longer. And I'd still be using the Vega if it hadn't self-destructed.
 
I would really love to know why you went to a 5820K after having a 7700K (and, obv, from the newer mobo/platform to the older one). I had a lot of trouble deciding whether to go with a 6700K or the 5820K when I first built this PC in March of 2016. That decision was driving me nuts, haha. If you're doing more than just gaming then I guess you can ignore this, since obviously that chip was far superior in multi-threaded work (like, actual "work") loads, lol.

Funny thing is that almost ever since I built the thing I've seen NO ONE discussing the 5820K anymore (not in any more benchmarks for games or anything). I've seen SO many threads with people asking questions about their 6700K systems (like this one, heh) over these years but I don't think I've ever come across a single one that was the same type of post but regarding a 5820K system. Maybe it's cause most of the people who owned that platform were more enthusiast-types and just ended up jumping to a new system in a generation or two... I guess that's all I can think of.
Just for pure curiosity to test the X99 platform :D
 
I haven't had time to scan the whole thread yet but my first thought is that a quad-core will be the bottleneck.

Check that your 6700K can actually deliver 60fps by running the game at 1080p. If you get stutters and frame drops in the games you're interested in, split your budget so that you can afford at least a B550 board and a 5700X CPU upgrade; IIRC that's still a cheaper option than Intel 12th/13th gen.
 
Most of the games I want to play are games where I can currently easily get 60fps (or more) at 1080p or 1440p, such as RDR2, God of War, basically all new CRPGs, Octopath Traveler 2 (for whatever reason I cannot get a solid 60fps at 4k in this game - it dips in certain spots), Total War: Warhammer 2/3 (1 I can already play at 4k "sufficiently", as it's impossible to lock 60fps in these games for all but the most monstrous systems, if at all), Gloomhaven, Nioh 1 & 2, Tainted Grail, Clash: Artifacts of Chaos, Old World, Spellforce: Conquest of Eo, Witcher 3 NGE....

I just listed a bunch off the top of my head to give an idea of what I play. Most of these are a few years old, and the newer ones are not big AAA titles. Some are just poorly optimized. Some of these games I'm even playing at 4k close to ultra with FSR enabled and I'd like to try and play at native ultra or at least go ALL the way to ultra w/ FSR. Just being able to play these games on higher settings or resolutions would be enough to warrant the upgrade for me, but there are some newer games I'd like to try as well, even though I'm not sure I'll be able to handle them. I'm mainly hoping to just boost the games I'm already able to play.

So.. this isn't really an upgrade with many future games in mind, at least prior to the platform upgrade (although, of course, I will play some). Again, off the top of my head real fast I can name a few - Age of Wonders 3, Moonbreaker, Black Myth: Wukong (if it ever releases), Eiyuden Chronicle: Hundred Heroes (may not even need an upgrade here but idk). See, I can't even think of any big blockbuster AAA titles atm, lol (it's late, tho). Maybe I'd try CP2077? Idk.. Also, any new fantasy 4X titles or Civ/Civ-like titles (Civ 7? hopefully), although again I know it's not a sure thing even with these games - they may very well want a better CPU as well. By that time I should have at least a used temporary platform upgrade, tho (like, prior to Civ 7 releasing lol).

OH.... forgot: I also want to try getting back into WoW. Maybe. Casually. I used to be a super hardcore raider (top 20 US) back in TBC-Cata. Obviously the classic version already runs on my computer at 4k brilliantly (I played a bit of TBC Classic in 2021), but I'm curious how the new retail stuff would run with my current processor and a 7800 XT. Not sure if it'd hold up in raids, but just being able to run 5-mans again (esp at 4k, if possible, but at least 1440 ultra) would be amazing.


Won't be installing the card until around Christmas, as one is a gift for my brother and we live together. Don't wanna bust my new card out til I can give it to him. I'll edit this post with "results" afterwards.
 
It sounds like you'll be fine with the 5700G - gaming performance will be adequate even if you're not getting the most out of a 7800XT.

Grand strategy and thousand-unit RTS/sim games are often the ones that benefit most from an X3D CPU upgrade. They're often choked by a single thread, so cache and, to some extent, clock speed matter more than core counts. There's no rush on that upgrade yet - you could wait until a game comes along that runs like shit on your 5700G and then see what the benchmarks for different CPUs look like for that game.
 
Well for now it'll just be the same ol' system with the 6700K (but with the 7800 XT). In time I'm looking into grabbing a used mobo+CPU to hold me over until much later when I do a full-on, heavily-researched, more expensive (5-10yr) build, but that's probably not until early spring. For the next few months I'll be on my 4.7 GHz 6700K (and he'll be on his 4.65 GHz 3930K, but again he'll be upgrading his CPU sooner than I will).

May I ask - how exactly does CPU bottlenecking show up in games? Is it not obvious most of the time? Or will it usually be something like: can't hit 60fps even though my GPU isn't showing 100% utilization or power, and the CPU has at least 1 thread sitting at 90-100% much of the time? If my GPU is being HAMMERED, 100% full-time, nearly overheating (75-80 °C, even when not really moving/doing anything), and NONE of my 8 CPU threads on my 6700K ever go above 25-40%, max (save maybe a single 2-second spike on 1-2 threads at 80% somewhere), can that STILL be a CPU bottleneck if I'm not able to reach my target fps/resolution/settings?
 
A CPU bottleneck is any time your GPU isn't at ~100%, assuming you're not running at your framerate limit.
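That rule of thumb as a toy function (the 95% threshold is just illustrative):

```python
# The rule of thumb above as a toy function; the 95% threshold is
# illustrative, not exact science.
def likely_bottleneck(gpu_util_pct, fps, fps_cap=None):
    if fps_cap is not None and fps >= fps_cap - 1:
        return "at the framerate cap -- no conclusion either way"
    if gpu_util_pct >= 95:
        return "GPU-bound"
    return "likely CPU-bound (the GPU is waiting on the CPU)"

print(likely_bottleneck(99, 47))              # GPU-bound
print(likely_bottleneck(65, 45))              # likely CPU-bound
print(likely_bottleneck(60, 60, fps_cap=60))  # capped -- can't tell
```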

You should consider returning the 7800XT and seeing what the 4070 Super has to offer when it releases in January, since you're not installing the card yet anyway. I've not heard someone say that FSR looks better than DLSS before; that's a new one :D. Newer games tested with RT on by default at every detail setting are really starting to show the difference between AMD/NVIDIA.

https://www.techpowerup.com/review/avatar-fop-performance-benchmark/5.html

For example the 7900XTX being about on par with a 3090 Ti, when it should be closer to the 4080.

Or the 7800 XT being beaten by a 3070 Ti.
Games are only going to move further down the road of RT being on all the time, as it's easier than custom light-baking everything, and all the new engines use it by default.

The 4070 Ti is getting 104 FPS in Avatar, the 7800XT 60. The 4070 Super should be fairly close to the Ti.

Bottom line is AMD is around a generation behind in RT, which is becoming the default.
 
I mean, even my i7 6700K @ 4.5 GHz was a bottleneck for my RTX 2070 Super. For example, in GTA V in some situations I'm getting double the fps now @ 1440p with my i7 12700K @ 5 GHz... My 2070 Super got a new boost with my i7 12700K.
 
Hmm... well I check my stats literally after every gaming session. Almost NEVER see gpu not being hammered, 90-100%, unless it's a game that just doesn't require much 'juice' at all. Remember, I do NOT play big AAA games (at least not brand new ones - and I only have 2-3 from 3-5 years ago I'm interested in playing atm). I've never even seen my VRAM usage above 8gigs, and that was only a handful of times. Usually 5-6gigs is tops, even at 4k/ultra. I'm mostly an indie gamer (lots of indie AA games - maybe some 1.5A? lol). But there are games I VERY MUCH want to play at higher settings that it SEEMS to me it's my gpu that's having the difficulties. Like, cpu doesn't come close to breaking a sweat and gpu will be on fire (card seems like it's starting to show its age but in benchmarks, etc. it's still within a few % of any other 1080 ti). So I'm still a bit confused. I remember being told not even to get a 1080 Ti when I had my 1080, and that gave me a VERY noticeable boost universally. I'm just after a small one for now. Even 5-7fps at 4k in some of the titles I mentioned would make the diff between playing in vsync60 or not playing at all.

One thing I AM considering is holding on to the 7800 XT (even though I thought I made a good choice and it seems like most people highly recommended it) for a week or two in case the 4070 Super seems like a much better choice. I mean, I have no problem staying with Nvidia. Would be easier for me, and I'm so used to Afterburner (not to mention control panel, inspector, NVCleanstall, etc...). This card isn't intended to be a 5+ year upgrade, though. More like 1-2. But I definitely see your point about the Nvidia "Supers" supposedly coming out end of January. I'm not sure if the 4070 Super will be close enough to the Ti, but we'll see. They aren't changing the size of the VRAM or its speed from what I thought I read earlier. Honestly, if AMD's drivers and 'ecosystem', etc. were as good as Nvidia's then I would have ZERO reason to even consider Nvidia. The whole selling people on upscaling and ray tracing thing just doesn't really apply to me at all. I would MUCH rather have a faster 'traditional' card. Upscaling/interpolation are icing, but not nearly as much for me as for most other gamers. As for Avatar... I could have sworn when I looked at the 30-GPU benchmark on, I think, Tom's, that the AMD cards kicked ass. DLSS3 frame generation isn't even offered in that game. Only FSR3. Clearly looked like an "AMD game" to me (just how there are Nvidia titles). Maybe that was a very early build and things have changed? But with only 12GB of VRAM on the 4070 Ti I'm not sure how it could do very well at very high settings, be it 4K OR 1440p (even with DLSS2 or FSR3 enabled to the max). I think they showed 16GB for 4k max and either JUST under or JUST over 12GB for 1440.

As for FSR2 vs DLSS, I've seen plenty of people (and reviews/sites) say FSR2 looks better to them, and also that FSR2 gives (almost 'objectively') better picture quality when comparing a frame at a time (I realize this isn't the same as in motion :)). From what I've seen they both look very similar in action, though, as long as you aren't trying to play with ray tracing, but the VAST majority of the games I play will not have either. I have been very happy with what FSR2 does for my 1080 Ti in the games that do have it, though. And it should be a bit better with a newer, faster AMD card. I hear conflicting things regarding whether the new AMD cards actually have an advantage over other vendors when it comes to FSR2/3+, though (from what I can gather they DO, but it's nothing major). But it'd be nice to have access to both DLSS and FSR ;)

I read a few articles the other week regarding FSR's performance vs. DLSS, and I saved a few screenshots from the first one I came across. Zoom in and it's obvious FSR looks better when comparing single frames. DLSS is much blurrier. Zoom in and look at the rocks (like on the left side of the first picture) or the grass (same picture, in front). Also things like the tires. All details look better. The second article I came across called FSR the clear winner/king of picture upscaling (I guess meaning, again, on a frame vs frame basis). I know DLSS gives more frames when actually playing, tho, and maybe the sharper images don't translate to much when playing at 60fps? I'll look more into it. More for my brother than for myself, since he's more into AAA stuff than me (but uses consoles for those mostly).
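If anyone wants something more repeatable than zooming into screenshots, here's a minimal single-frame PSNR sketch (it assumes Pillow and NumPy are installed and that you have aligned, same-size captures; the filenames are made up). Keep in mind a single-frame metric misses temporal artifacts like shimmer, which is exactly where these upscalers differ most in motion:

```python
# Minimal single-frame comparison: PSNR of upscaled captures against
# a native-resolution capture. Assumes Pillow + NumPy and aligned,
# same-size screenshots; the filenames are hypothetical.
import numpy as np
from PIL import Image

def psnr(path_a, path_b):
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    mse = np.mean((a - b) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0**2 / mse)

print("FSR  vs native:", psnr("native.png", "fsr_quality.png"))
print("DLSS vs native:", psnr("native.png", "dlss_quality.png"))
```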

EDIT: responded to some other things and linked 2 screenshots I saved from a review
 


Yeah people like to recommend them, in a vacuum frozen moment they might make sense, but like I said the industry is moving towards making RT exclusive games, as we see with Avatar. TPU reviews and comparisons do not show FSR as superior, in my own experience DLAA is unbeatable and Quality DLSS hard to differentiate from native much of the time.
 
I don't think you were "in" this thread at the start. I mentioned how one of the 3 cards I was interested in (probably the MOST interested) was a 3080 12GB (used, ~$500). What would you think about that compared to a 7800XT or a 4070, based on what I have said (keeping in mind I WILL be buying a GPU ahead of a platform upgrade, which will be in a few months, and that this is more of a 'bridge' upgrade than a long-term one)? Ignore the 4070 Super; depending on when it's released, price, etc. I may consider that anyway.

I see you're TPU staff, so I will trust you know much better than I.. saw your system specs and noticed you had a 3080 Ti, and the rest of your system looks insane. What is your next consideration for gpu?
 
only true if you consider a memory/cache bottleneck as part of cpu bottleneck
Both are subsystems of the CPU, so yes. X3D chips, for example, are just normal chips with more cache. Memory is basically just L4 cache.
What is your next consideration for gpu?
I wouldn't really consider Ampere or RDNA2 with prices as they are. *Maybe* a 6700XT if your budget is super tight; otherwise the 4070 is basically where current gen starts making sense. The 4060 Ti is OK but definitely a 1080p card, and RDNA3 isn't compelling at all, with the 6800XT beating the 7800XT in several games and the 7900XT/XTX being pretty cringe performance-wise when you use modern tech like RT etc. Essentially, I wouldn't pay more than $300 for a GPU that's bad at RT, since it's now the standard lighting in new games, and FSR is pretty bad, being beaten even by Intel's XeSS for IQ.

Bear in mind the $600 4070 is basically equivalent to my $1000+ 3080 Ti, which is what they went for when I bought mine. I don't think it's expensive at all, considering it's almost twice as efficient too. The Supers should move that needle even more.
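Napkin math on that value point, using just the numbers above (illustrative only):

```python
# Price-to-performance using the figures in this post: a $600 4070
# roughly matching a 3080 Ti that sold for $1000+ at the time.
cards = {
    "RTX 3080 Ti (launch-era price)": (100, 1000),  # (rel. perf, USD)
    "RTX 4070": (100, 600),
}
for name, (perf, price) in cards.items():
    print(f"{name}: {perf / price:.2f} perf per dollar")
```

Same performance for 60% of the price.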

FG/DLSS3 is also very good in games that support it, but don't make a decision based on that, it's just a bonus.

Basically I wouldn't even consider anything lower than a 3080 for a new purchase.
[TPU relative performance chart, 1920x1080]


And 4K.
[TPU relative performance chart, 3840x2160]


the rest of your system looks insane
Thanks

I'll probably be moving to a 4080 S or a 4090 soon.
 
Friendly reminder that not everyone uses raytracing. Pushing that feature on people so they can buy green at a worse price/performance deal really shows your fanboyism when it's an option.
 
I was about to reply re-stating my total indifference to ray tracing and to "next-gen" titles in general. I DO NOT play them.

Here's a good question that will play directly into which card works best for me/us:

I can play RDR2, God of War, Pathfinder: WotR, and TW:WH at 1440p/60 atm. Will going to a 7800XT (or 3080, or 4070) allow me to play at 4k/60 (again, with caveats here and there, mainly for TW:WH)? The one post-processing effect that gives me any problems (and it's quite a bit) is screen space reflections, fwiw. I can also play next-gen Witcher 3 at a locked 60 on 4k high with FSR2 balanced/quality currently. Would I be able to crank that up from high to Ultra and get the same FPS, even if it still requires some bit of help from FSR? (I know you can't guarantee anything, but these are all older games I am very keen to play at 4k/60, and almost everything else I play will not be anywhere near as demanding.) These, and MAYBE CP2077 (I hesitate to mention this title due to Nvidia/RT being much more attractive here) later down the line.

Oh yeah - and Baldur's Gate 3. This is the one big, NEW game both of us will be playing. I can get 4k/60/med-high with FSR1 (back when I played it, that's all it had for FSR). I'd like to get 4k/60/ultra. No ray tracing. I also would like to try WoW again, even if it plays at 1440 in groups. If I can't handle it with this CPU, then oh well.

All else are indie titles that are not very demanding and do not have RT or upscaling as an option at all (a few MAY have FSR or even DLSS, but certainly no RT), but I can't quite get the performance at 4k that I'd like (on the lower end here would be Octopath Traveler 2, to give you another example of a game included here, and it only dips here and there). I have quite a few titles that seem either poorly optimized or just a BIT more demanding than my system can handle at 4k/max/60 (with 1440p being no problem). I only play on a 4k 55" TV, so 4K/60 is ALWAYS the target.

Again, Ray Tracing is NOT a real consideration for me and doesn't much play into my choice of card here. It's bells and whistles for the very few games I may play that have the option. I can guarantee you it will remain the case for the lifetime of this gpu upgrade.

I realize I could buy a 6750XT and probably get most of the "boosts" I'm after atm (if I'm NOT totally CPU-flubbed in those titles...), but this upgrade is meant to bridge between this build and my 'temp upgrade' PC (where I use a newer, but not flagship and probably used, CPU/mobo) until maybe 1.5-2 years later when I do a full-on near-flagship rebuild. I want more than a 6750 XT for this ~2yr temp/bridge upgrade, hence why I'm considering what I'm considering. I'm looking for the beefiest upper-mid-range ($500-600) card I can get atm that has the best capabilities in traditional, rasterized performance (hence why I say the 4070 isn't really a consideration compared to the stronger AMD options at this price point, although the 4070 Super MIGHT be, as long as it's RIGHT around the corner - I don't want to keep putting this off; I've already waited about 2 years longer than I wanted to, and as I said before it's not more than a 2yr upgrade anyway). Plus I've already got the 2 ASRock Phantom Gaming 7800 XTs (not opened, and returnable until the end of January).

I may sound like I have my "mind made up" about certain things, but that's mostly just that I want a new GPU now, even though I'll still be on the older 6700K for a bit. The original question regarding the 7800 XT vs 3080 (used) vs 6950 (which is now out of my reach as it went back to $800) is still very much swirling around in my head (even now, after I thought I'd made a final decision, lol). I may just have to open the cards and test them out on our systems to see for myself, and if I truly CANNOT get an inch more out of my games on my current PC then I always have exchange options.

EDIT: Oh yea, there is ONE title I want to try with ray tracing. I have yet to ever play the original Portal or Portal 2 (I own both), and I would be interested in playing THIS game with ray tracing. That's all.
 