
~$750-800 GPU : RTX 4070 Ti vs. RX 7900 XT for Gaming

Which is the better card for the price?


Status
Not open for further replies.
My XTX used to either flicker the screen randomly with FreeSync on, or draw 100 W at idle with it off. These issues have since been fixed, but it was really worrying at the time. By contrast, I never saw this many little glitches with my old 3070.
Yep, the last update was very decent indeed!


Isn't what you're really saying more a case of Nvidia's hardware being more readily integrated/supported in software? Because I don't disagree there, and it happens with the smaller things you come across. An example: I use Wonderdraft, a small tabletop RPG tool for making maps. On the 7900 XT it runs in OpenGL at 50 FPS. On Nvidia, I got my monitor's refresh rate.

That's really just an aspect of market share. But it's reality nonetheless. However, I wouldn't attribute it to driver 'quality'. When it comes to that, having made the switch recently, I can say AMD has its driver and its tweaking GUI in good order, equal to and in some aspects better than what Nvidia currently has. GeForce Experience is still a hot mess, and the NVCP is still Windows 95 style, while Adrenalin centers and orders it all nicely in one app. Let's give credit where it's due... As for the quality gap with respect to bugs and stability, I don't recognize it at all. Not yet, at least.

I’d go for the full-fat 7900 XTX if you can nab it on sale or stretch the budget a bit more
Yeah, the XTX is way better value
 
Get the 7900 XT on a super sale or go for the XTX version.
 
7900 XT without a doubt. I wouldn't get a more expensive 12GB card than 6750 XT myself these days.
 
7900 XT without a doubt. I wouldn't get a more expensive 12GB card than 6750 XT myself these days.
The 6750 XT is a great 1440p GPU that can also push 4K if need be. It compares well to my 6700 XT, which can play anything I throw at it at high picture-quality settings at 3440 x 1440.
 
The 6750 XT is a great 1440p GPU that can also push 4K if need be. It compares well to my 6700 XT, which can play anything I throw at it at high picture-quality settings at 3440 x 1440.
I also have a 6700 XT, and I play at 4K. The 6750 XT is practically just a higher-clocked 6700 XT, so there's not much difference.

Also, having a FreeSync monitor makes those random dips below 60 fps okay, as they don't bother me too much.
 
Sorry, but I would stick with NVidia:
- AMD drivers and older games can still be problematic
- power consumption
- GPU computing on AMD is oddly limited for non-"professional" cards
- CUDA is more common than portable compute stacks, sadly
 
Sorry, but I would stick with NVidia:
- AMD drivers and older games can still be problematic
I'd like to know more about this, since I haven't had a single problem with my 6700 XT.
 
I'd like to know more about this, since I haven't had a single problem with my 6700 XT.

Combat Mission x2 is still problematic. AMD recently fixed one issue (white flashes on explosions), but apparently incorrect errors about being out of video RAM continue. Combat Mission v1 still doesn't have fog, I guess (doesn't Baldur's Gate 2 have the same problem?).

I can't really give complete lists since wargamers (including me) tend to buy NVidia, so the current state of AMD drivers for games known to be problematic is largely unknown.

There was a recent large change to the AMD Windows drivers that made loading much faster but left some things behind.
 
Combat Mission x2 is still problematic. AMD recently fixed one issue (white flashes on explosions), but apparently incorrect errors about being out of video RAM continue. Combat Mission v1 still doesn't have fog, I guess (doesn't Baldur's Gate 2 have the same problem?).
Weird, as I can run today's games and 20-year-old ones without problems. It wasn't that long ago that I finished Far Cry on my 6700 XT.
 
Weird, as I can run today's games and 20-year-old ones without problems. It wasn't that long ago that I finished Far Cry on my 6700 XT.

I admit that it is hard for me to stay up-to-date with what works on AMD since I haven't bought an AMD card in a few years. But I do play CMx2 and it has problems right now.
 
Sorry, but I would stick with NVidia:
- AMD drivers and older games can still be problematic
- power consumption
- GPU computing on AMD is oddly limited for non-"professional" cards
- CUDA is more common than portable compute stacks, sadly
Maybe 15 years ago, but not today.
AMD is more than capable nowadays.
But to each their own I suppose.
 
I also have a 6700 XT, and I play at 4K. The 6750 XT is practically just a higher-clocked 6700 XT, so there's not much difference.

Also, having a FreeSync monitor makes those random dips below 60 fps okay, as they don't bother me too much.
lol, what settings do you play on? Ultra low? I had a 6800 and never considered it a 4K GPU.
 
I just went through this dilemma for an eGPU setup with a GPD Win 4 + Razer Core X. I first bought a Sapphire Pulse 7900 XT for an all-AMD setup, but I had issues with a black screen during both Windows 10 and 11 boot-up with the enclosure connected. I tried everything. I think the drivers were conflicting with the 6800U iGPU (both AMD, for God's sake!). It also caused shader-load stutters when playing Everspace 2 (game of the hour). I think AMD generally has issues with open-world games with lots of assets popping in and out; in the case of Everspace 2, ships are jumping in and out of the map. It also ran very, very hot and loud.

I returned the AMD card and bought an Asus ProArt 4070 Ti afterwards. The aforementioned issues are non-existent.

This is my second time trying an AMD GPU, but for some reason random issues have caused me to return it and settle on an Nvidia card. There's always an issue with the one particular game I'm playing at the moment.
 
I don’t think there’s anything worse than this: someone buys your product despite the general trend and still goes back to the competitor.
Btw, is there anyone who went the opposite way? From green to red?
 
I admit that it is hard for me to stay up-to-date with what works on AMD since I haven't bought an AMD card in a few years. But I do play CMx2 and it has problems right now.
Divine Divinity: add that to the list of games not working, at least on RDNA2. You get black boxes around every object.

I don’t think there’s anything worse than this: someone buys your product despite the general trend and still goes back to the competitor.
Btw, is there anyone who went the opposite way? From green to red?
What I find fascinating is that AMD wins all popularity votes, every single time, no matter what. Yet when it comes to actual sales...
 
I don’t think there’s anything worse than this: someone buys your product despite the general trend and still goes back to the competitor.
Btw, is there anyone who went the opposite way? From green to red?
I did, but not the way described here. I just upgraded from a 1080 to RDNA3. No complaints, really, but it's not 'perfect' either; I'll take minor imperfections for granted if that means having some meaningful slab of silicon for my money. The alternative being a 12GB option makes things pretty easy. I still think RDNA3 is going to age a lot better; we're already seeing the signs. We're also seeing the architecture excel in Unreal Engine 5.x and non-proprietary RT implementations.

As it always is... patience is a virtue. And I'm honestly not missing a single Nvidia 'feature'. DLSS? I just nuke all content at >70 FPS native, whatever. The tables have turned: Nvidia is no longer the set-and-forget option it used to be; you're tweaking right out of the box, and per game if you really want to extract the 'fantastic features'. RDNA3 was set and forget, and it chews up anything in doing so. The baseline performance is just solid all-round, and that's really all I need.
 
I did, but not the way described here. I just upgraded from a 1080 to RDNA3. No complaints, really, but it's not 'perfect' either; I'll take minor imperfections for granted if that means having some meaningful slab of silicon for my money. The alternative being a 12GB option makes things pretty easy. I still think RDNA3 is going to age a lot better; we're already seeing the signs. We're also seeing the architecture excel in Unreal Engine 5.x and non-proprietary RT implementations.

As it always is... patience is a virtue. And I'm honestly not missing a single Nvidia 'feature'. DLSS? I just nuke all content at >70 FPS native, whatever. The tables have turned: Nvidia is no longer the set-and-forget option it used to be; you're tweaking right out of the box, and per game if you really want to extract the 'fantastic features'. RDNA3 was set and forget, and it chews up anything in doing so. The baseline performance is just solid all-round, and that's really all I need.
I have a 4090 and I use DLSS. I definitely can't nuke all content at over 70 fps. Maybe I need to upgrade to your card.
 
Upgrading is different, and it's normal to consider all the available options.
I upgraded back then to a 5700 XT, coming from an Nvidia card.
The sidegrade because of issues seems to be one-way though: from red to green.
 
I have a 4090 and I use DLSS. I definitely can't nuke all content at over 70 fps. Maybe I need to upgrade to your card.
The key difference is probably that I'm not playing graphics poster children before truly good games ;) I also have no desire to find the settings that will kill performance, like RT/PT. I play games to play games, not to gape at whether a pixel is the right color.

You figured it out: even $1,500 of card won't give you infinite performance. Gaming was always about having 'enough' for the time frame and the games you play.
 
The key difference is probably that I'm not playing graphics poster children before truly good games ;)
Me neither, honestly. And although Starfield doesn't fit into either category, I drop to 45 fps at native 4K.

But that's beside the point regardless. If I have fps to spare in a game, I don't play at native; I play with DLDSR + DLSS. Much better image quality. There isn't a single reason to play natively anymore if you have Nvidia.
 
Me neither, honestly. And although Starfield doesn't fit into either category, I drop to 45 fps at native 4K.

But that's beside the point regardless. If I have fps to spare in a game, I don't play at native; I play with DLDSR + DLSS. Much better image quality. There isn't a single reason to play natively anymore if you have Nvidia.
Starfield, the great equalizer lol
 
For pure raster and nothing else (meaning no features enabled) in popular titles, the 7900 XT is good, mostly for 1440p I'd say. 4K/UHD gaming is stretching it with a GPU of this caliber, depending on the game obviously. Both can do 4K/UHD in many games, but the most demanding ones will require upscaling if you don't settle for lower settings or fps.

I'd probably take the 4070 Ti myself, because I'm not looking at raster performance only; this is 2023. I would hate not to have DLSS/DLAA/DLDSR/ShadowPlay/CUDA/Reflex and usable RT, plus better drivers (meaning better overall performance across many games, including less popular ones, early access titles especially, and emulators).

Looking at pure raster though, and minimum fps, they are very close -> https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/35.html
The 7900 XT wins by 4% at 1440p and 6.5% at 4K/UHD. However, neither of these is a true 4K card IMO; more like 1440p and 3440x1440 solutions without upscaling, and DLSS beats FSR every time.

AMD spends most of its time optimizing games that get benchmarked a lot; if/when you play less popular games and early access games on an AMD GPU, performance is often wonky. Most developers also test and optimize on Nvidia, because the majority of PC gamers use Nvidia (like 80-85% according to the Steam Hardware Survey). Also, Nvidia always has launch drivers for new games; AMD doesn't. This is generally why Nvidia sells better. A lot of people had a bad experience with an AMD GPU earlier, and AMD still struggles with a bad rep in general.

I'd take all those RTX features any day over a ~5% lead in raster. AMD really needs to improve their feature-set if they want to compete. Most people buying high-end cards don't just look at raster today. I sure don't. I use some form of RTX feature in pretty much all new games today. Absolutely love DLAA. Best AA method by far today.

So yeah, for 1440p gaming, I'd go with 4070 Ti for sure unless you play some specific game all the time that runs better on AMD.

At 1440p, the 4070 Ti literally gives you 3090 Ti performance with ~50% less power usage, and 12GB of VRAM is more than enough for 1440p. The 3090 Ti was released 9-10 months earlier with a price tag of 1,999 dollars.
Very few games even use 8GB at 1440p, and VRAM requirements aren't jumping anytime soon.

In the end it depends on local prices and sales, plus maybe game bundles. The 4070 Ti outsold the 7900 XT very easily worldwide; about 4-5 times as many, according to Steam.

Looking at "raw specs" is pointless when the architectures are completely different. Look at performance, features, and power: the 4070 Ti is a great option overall, and it would be my choice for sure for solid 1440p high-refresh-rate gaming.
 
For pure raster and nothing else (meaning no features enabled) in popular titles, the 7900 XT is good, mostly for 1440p I'd say. 4K/UHD gaming is stretching it with a GPU of this caliber, depending on the game obviously. Both can do 4K/UHD in many games, but the most demanding ones will require upscaling if you don't settle for lower settings or fps.
A 7900XT is fine for 4K regardless of what you think.
 
A 7900XT is fine for 4K regardless of what you think.
Nah, it's not for me, plus there's no upscaling to save you, because FSR is inferior to DLSS.
 