
4080 or 7900 XTX

Status: Not open for further replies.
Hm, the 3070, 3080, 3080 Ti and the entire 2000 series beg to differ. Their features sure as heck did not help how quickly they were outmoded, particularly the 3070 which essentially became unable to do RT within 1 year and 10 months in new AAA titles.



This is a patently false statement:

View attachment 316183

The 7900 XTX is significantly faster than the 3080 in RT. Some people in this thread are greatly exaggerating Nvidia's benefits. Either card is a good choice.



As a person who recently upgraded from a 1080 Ti, the 11GB of VRAM the card had was absolutely critical to keeping it going later in its life. I was able to keep texture settings at max thanks to that, while owners of even the newer 2080 had to dial down settings. If you plan on keeping your card for a few generations, VRAM will absolutely play a role.

If you look at core counts, TMUs, ROPs, etc., the 4090 has 68% more resources at its disposal. As games become more demanding and as faster CPUs allow the 4090 to stretch its wings, the 4090 will pull farther ahead. Really, the current 4080 is a 4070 / 4060 Ti if you compare die size; Nvidia is giving you much less for your money compared to prior gens. There's nothing that can be done about that though, both AMD and Nvidia like to keep prices high.
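For what it's worth, here's a quick back-of-the-envelope check of that 68% figure using the commonly published spec-sheet numbers (treat these as approximate, I'm just going off the spec sheets):

# Rough sanity check of the "68% more resources" claim, using spec-sheet numbers.
specs = {
    "RTX 4090": {"shaders": 16384, "tmus": 512, "rops": 176},
    "RTX 4080": {"shaders": 9728,  "tmus": 304, "rops": 112},
}

for unit in ("shaders", "tmus", "rops"):
    ratio = specs["RTX 4090"][unit] / specs["RTX 4080"][unit]
    print(f"{unit}: the 4090 has {ratio - 1:.0%} more")
# shaders and TMUs come out to roughly 68% more, ROPs to roughly 57% more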

Yeah, let's see about that. 3070 8GB vs 6700 XT 12GB, which launched at $499 and $479 -> https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/35.html
The 3070 easily wins in 4K/UHD minimum fps ;) 4GB more VRAM did nothing for the 6700 XT. The 3070 won on release and it wins today.

You can always find a game or two that will make a GPU drop to its knees, even a 4090, so who cares? People adjust settings to get 60+ fps or even 100-120+ fps, and doing this will lower VRAM usage on weaker GPUs anyway because they can't handle demanding games on max settings at high res to begin with - pointless to put tons of VRAM on a weak GPU :laugh:

The 1080 Ti has been slow for years. Today it is not even considered mid-end. You are not maxing any demanding games at high res using a 1080 Ti regardless of VRAM - the chip itself is dated and weak, stop lying LMAO.


Techspot did a 3080 vs 6800 XT revisit in 2022 and 2023, and the 3080 won by 6-7% at 4K/UHD -> https://www.techspot.com/review/2427-geforce-rtx-3080-vs-radeon-6800-xt/

VRAM is only important if the chip is fast enough to actually use it. Logic 101. And no chip will be considered fast in 5+ years; not even the 4090 will be.

The 1080 Ti has 11GB, which is decent, but the chip is mediocre, forcing you to lower settings anyway, which in turn lowers VRAM requirements and usage.

Cheap cards with a lot of VRAM are not going to age well regardless. The chip is going to be the limiting factor. Again, look at the 6700 XT/6750 XT. Look at the 1080 Ti, look at the 2080 Ti, look at the 6800 16GB - none of them are considered fast today, and none will be able to max out games at high res anyway.

Or even look at the 3090 24GB, which is beaten by the 4070 Ti 12GB in pretty much every game at 4K/UHD today, and the 4070 Ti is not even aimed at 4K/UHD. At some point, yeah, more than 12GB will be useful, but NONE OF THESE CARDS are going to run games on settings that REQUIRE MORE THAN 12GB when that is the case. How can you not understand this simple fact? Higher settings PUSH THE GPU AS WELL, not only VRAM, LMAO!

I always laugh when people think they can futureproof their GPU with a lot of VRAM... After 2-3 generations not even AMD/Nvidia is going to bother supporting the card anyway, and you are better off buying a new mid-end card that will wreck that old "futureproofed" card with ease at half the power usage or even less :laugh:


It does not matter, all cards perform identically, the Hellhound is just pre-overclocked.

And no, the 4080 isn't just 17% faster in RT; it's just that there are games that don't make much use of it. In heavy RT games (you know, the ones where you actually want to enable it because it looks nice), the difference can definitely get higher than 50%.


You know why the 7900 XTX is faster in Far Cry 6 even with RT? Because the game barely has any RT. All of these AMD-sponsored games make as little use of RT as possible, to the point that it actually looks WORSE than running without RT. ALL of them. Resident Evil Village runs RT shadows at 1/4 resolution. The only reason this is happening is to fool people into thinking Nvidia and AMD are close in RT performance. They are not.

Right on. AMD-sponsored titles with RT are not using proper RT, meaning reduced resolution (around 1/4) to keep AMD GPUs from dropping to insanely low levels of performance.

Just look at Cyberpunk ray tracing performance. The 7900 XTX loses to the 3080, and with Path Tracing enabled the 7900 XTX is at roughly 4060 8GB level.

Games with ACTUAL PROPER RT and even PT are killing AMD GPUs; the framerate tanks. Just like consoles are useless for RT even though some games claim RT.

RDNA4 won't have high-end SKUs and RT perf won't improve
RDNA5 might be able to do RT, we will see in 2025+
 
First and foremost, why in the world are you going back to older posts in this thread to pull in people who stopped commenting specifically to avoid the clown fiesta this thread has become? Clearly you are a glutton for punishment, so I shall give it to you.

Yeah, let's see about that. 3070 8GB vs 6700 XT 12GB, which launched at $499 and $479 -> https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/35.html
The 3070 easily wins in 4K/UHD minimum fps ;) 4GB more VRAM did nothing for the 6700 XT. The 3070 won on release and it wins today.

My comment was in regard to RT performance, and you linked minimums. It should be obvious given it's directly in the sentence you quoted, but here you are, trying to provide minimum fps as a substitute for RT data. Here's the issue I was referring to:


1696519416537.png

1696519473246.png



In addition, The Last of Us, Halo Infinite, Hogwarts Legacy, and Forspoken all have texture swapping issues on the 3070, so visual quality in those games is simply worse than on cards with more than 8GB of VRAM.

You can always find a game or two that will make a GPU drop to its knees, even a 4090, so who cares? People adjust settings to get 60+ fps or even 100-120+ fps, and doing this will lower VRAM usage on weaker GPUs anyway because they can't handle demanding games on max settings at high res to begin with - pointless to put tons of VRAM on a weak GPU :laugh:

As shown above, it's more than 2 games. This is a misleading argument because it implies that VRAM requirements aren't increasing for games.

The 1080 Ti has been slow for years. Today it is not even considered mid-end. You are not maxing any demanding games at high res using a 1080 Ti regardless of VRAM - the chip itself is dated and weak, stop lying LMAO.

First, no one said anything about high res (which could mean a variety of things depending on who you ask). That's a stipulation you seem to have assumed and added yourself.

Second, yes, you can absolutely max out graphics settings in modern games thanks to its VRAM size:


I merely stated that I was able to keep texture settings at max, and the video above proves that. You come in here calling me a liar; to say that's a hyperbolic response would be accurate. Either you completely lack social decorum or you are venting at random people on the internet. I'll defer to Hanlon's razor in this instance.

This thread has gone completely off topic and turned into a clown fiesta. You aren't on topic and you are just being hyperbolic. This is my last comment here, good day.
 
So the 4060ti 16gb is better than the 7700xt because of the extra vram? It will age better, allowing you to select high quality textures when the 12gb of the 7700xt just won't cut it anymore, right?
 
So the 4060ti 16gb is better than the 7700xt because of the extra vram? It will age better, allowing you to select high quality textures when the 12gb of the 7700xt just won't cut it anymore, right?

Yes, theoretically right. 12 GB << 16 GB, so a card with 16 GB should behave better if it doesn't run into a shaders / ROPs / memory throughput bottleneck first!
 
So the 4060ti 16gb is better than the 7700xt because of the extra vram? It will age better, allowing you to select high quality textures when the 12gb of the 7700xt just won't cut it anymore, right?

I think the 12GB vs 16GB debate really comes down to nobody knowing when 12GB won't be enough for modern games. While I don't think anyone should be buying $350+ 8GB GPUs, blowing a couple of terrible console ports out of proportion isn't really a solid argument. The 4060 Ti should be avoided regardless of how much VRAM they throw at it, although the 7700 XT should be as well because it doesn't make sense vs the 7800 XT.

16GB will be enough at least through the next console generation and likely 2-3 years after that. As much as PC gamers want to live in denial, consoles really dictate the minimum level of hardware someone should target, and as long as the Xbox Series S exists, games will run fine on anything with at least 10-12GB for the foreseeable future, other than the 2-3 terrible console ports we get every year.


This was posted already, and I mean he says if the cards are within 15%, buy the 4080 over the 7900 XTX... although my personal gap is 20%.

I would defer to him as someone who has used both cards extensively. I can only simulate these cards in my head: the 4080 being 30% slower than my 4090, and the 7900 XTX which can't even beat my nearly 3-year-old 3080 Ti in every game with RT unless it only uses console-level RT.

For anyone whose jam is COD, though, I always recommend RDNA3 over Ada; there are actually a lot of people who only play that one game.
 
One of the things that happens in these threads is Nvidia users telling AMD users how bad their cards are versus what the OP is actually asking about. The kicker is they will use day-one reviews to support their opinion. I can tell anyone that the 7900 XT is fine for 4K and is the best-priced 4K-capable card (in all games) in Canada. GPUs are still expensive here, so a 6700 XT is still $470. Then you look and 6800 XTs are still $750 at the lowest. The cheapest 7900 XT is about $1100. The 4080 is $1600 and more expensive than the 7900 XTX, which is down to $1250 from $1600 at launch. Whether people believe it or not, VRAM matters; the buffer on the GPU itself will always be better than having to spill over into system RAM. The problem for me with Nvidia (you will never see me bash Nvidia personally) is cost. There is no way I could explain to my wife a $2400 4090 for something she cannot appreciate.

I also have a library of over 840 games, so I prefer raw performance over features. Now here is the newest reason to get the 7900 XTX: the latest AMD driver for 7000-series cards is absolutely spectacular. The software applies settings from the software package on a per-game basis. I'll use Forspoken (not actually a bad game) as an example: it sees about a 30% improvement in frames, and I can also see micro-stutter rate, 99% fps, frame-gen lag and frame time. The fact is AMD is quietly making their cards rock solid and has always remained more relevant than Nvidia for a cheaper price.
 

We had two 7900 XTX users saying the experience sucks, buy the 4080 as well... but they don't count. Also, the reviewer using both cards in the HUB video says buy the 4080... but that also doesn't count; he's only been doing it for 20+ years, but what does he know...
 
The 1% lows with RX 7900 XTX are terrible. :banghead: :eek: :kookoo:

Game is Star Wars Jedi Survivor, relative performance, average fps.
1696525858115.png


 
We had two 7900 XTX users saying the experience sucks, buy the 4080 as well... but they don't count. Also the reviewer using both cards in the video says buy the 4080... but that also doesn't count.

Sure it counts, but there are also a bunch of people happy with the AMD cards. In the end it comes down to what performance you want, what features you need and how much you want to/can spend. The shorthand is essentially if you want RT or use any kind of GPGPU stuff, Nvidia. If you don't care for that, why not AMD.
 
Sure it counts, but there are also a bunch of people happy with the AMD cards. In the end it comes down to what performance you want, what features you need and how much you want to/can spend. The shorthand is essentially if you want RT or use any kind of GPGPU stuff, Nvidia. If you don't care for that, why not AMD.

100%, and their views of the hardware are just as important, same with all the Ada owners.

And really, we will never have a mostly civil conversation between AMD/Nvidia users, but like I said before, these GPUs are pretty easy to decide between at the prices the OP listed.

I only have a 4090 and a 4070 to compare the 4080 to, and only a 6700 XT as my most recent AMD card. I like AMD cards when they are much cheaper and offer something clearly better in the here and now; for example, I would never buy a 3060/3050/4060/4060 Ti over a 6700 XT/6750 XT because I don't believe in 8GB GPUs. Does that make my choice right for everyone? No, but the things Nvidia is good at (DLSS/RT) I would never use on a sub-$500 GPU that I would target 1080p with anyway.

The only current-generation GPUs I feel are a toss-up are the 7800 XT vs the 4070 and the 7900 XT vs the 4070 Ti, but I'd still want the AMD cards to be decently cheaper, at least 10-15%.

But I'm just one person. Some wouldn't buy Nvidia regardless of how much better it is vs AMD, or vice versa, and that is fine; at the end of the day it is our own hard-earned money coming out of our pockets. It's easy to recommend one GPU over another when we aren't the ones actually paying for them.
 
In the end it comes down to what performance you want, what features you need and how much you want to/can spend.

It is about what image quality you prefer. I still remain confident that nvidia's images are blurred, the colours washed out, and the texture resolution is lower.
On the left is AMD Radeon RX 7900 XTX, on the right is nvidia rtx 4080:

1696527949439.png
 
It is about what image quality you prefer. I still remain confident that nvidia's images are blurred, the colours washed out, and the texture resolution is lower.
On the left is AMD Radeon RX 7900 XTX, on the right is nvidia rtx 4080:

View attachment 316314

Not the best image for comparison. Try something brighter with more color.
 
It is about what image quality you prefer. I still remain confident that nvidia's images are blurred, the colours washed out, and the texture resolution is lower.
On the left is AMD Radeon RX 7900 XTX, on the right is nvidia rtx 4080:

View attachment 316314

This sounds more like video compression artifacting to me than anything else. You're not seriously pulling images off a YouTube video and trying to compare them, are you?

This idea that "Radeon has better image quality, Nvidia cheats, etc." has always been patently false, just speculation that AMD fans have fueled over the years.

OP, if you value your sanity, buy the 4080. The drivers on the 7900 XTX are enough to make anyone who isn't emotionally invested in AMD, or doesn't have hours upon hours to troubleshoot problems, give up. And don't expect them to fix anything you may find, because they won't.
 
It is about what image quality you prefer. I still remain confident that nvidia's images are blurred, the colours washed out, and the texture resolution is lower.
On the left is AMD Radeon RX 7900 XTX, on the right is nvidia rtx 4080:

View attachment 316314

I mean ... sure? If that is what you care about put it into the Performance or Feature metric. I'm probably quite blessed with my ability to not notice such things.
 
We had two 7900 XTX users saying the experience sucks, buy the 4080 as well... but they don't count. Also, the reviewer using both cards in the HUB video says buy the 4080... but that also doesn't count; he's only been doing it for 20+ years, but what does he know...
Lmao, that is a reference point now? Put down the fanboyism, man.
 
I mean ... sure? If that is what you care about put it into the Performance or Feature metric. I'm probably quite blessed with my ability to not notice such things.

I can't use an nvidia card exactly because of this. Any time I put such a card in any of my PC builds, I get annoyed by the different image quality - too bright, too washed out. It's very difficult to adjust that even if I tinker in the nvidia control panel...
 
Lmao, that is a reference point now? Put down the fanboy man.

Why wouldn't those be references? Look up as much as possible and base your purchase on that, that is the only way to do it. In a perfect world we could just look at a single graph and that would be everything we needed, but that is not how things are. It has never been so annoying to buy GPUs as now.
 
Why wouldn't those be references? Look up as much as possible and base your purchase on that, that is the only way to do it. In a perfect world we could just look at a single graph and that would be everything we needed, but that is not how things are. It has never been so annoying to buy GPUs as now.
I installed one and posted that it was w/o issue. Is that not a reference point? Da fuck. Y'all only use what supports y'all's agenda.
 
Again, left is Radeon, right is GeForce:

View attachment 316317



Yeah, obviously the right hand side artifacts more than the left hand side?

News flash: a lossy video source is not stable or comparable. Even attempting to compare this is rubbish. I honestly don't understand why AMD fans need to keep perpetuating lies like this; it just gives them all a bad name. Try to look for such imperfections on an image extracted from my 4080's framebuffer. Good luck.

1693985642.png
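If anyone actually wants to compare image quality between cards, do it with lossless captures taken at the same spot and settings on each card, not with recompressed YouTube frames. A rough sketch of what that could look like (the filenames below are just placeholders for your own screenshots):

# Rough sketch: diff two lossless screenshots pixel by pixel.
# "radeon.png" and "geforce.png" are placeholder filenames for your own captures.
from PIL import Image, ImageChops

a = Image.open("radeon.png").convert("RGB")
b = Image.open("geforce.png").convert("RGB")
assert a.size == b.size, "captures must be the same resolution"

diff = ImageChops.difference(a, b)
print("differing region:", diff.getbbox())          # None means the images are pixel-identical
print("per-channel (min, max) delta:", diff.getextrema())
diff.save("diff.png")                                # visual map of where the pixels differ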



I can't use an nvidia card exactly because of this. Any time I put such a card in any of my PC builds, I get annoyed by the different image quality - too bright, too washed out. It's very difficult to adjust that even if I tinker in the nvidia control panel...

Again, this is 100% and purely personal bias. This does not occur, it's patently false. Nvidia does not have poorer image quality, it does not display differently from AMD whatsoever.
 
I installed one and posted that it was w/o issue. Is that not a reference point? Da fuck. Y'all only use what supports y'all's agenda.

I genuinely don't know what you're trying to say.
 
I genuinely don't know what you're trying to say.
The post I quoted only referenced two recent problem threads, yet ignores the count of users with the card in question who had no issues. /smh

Threads like these only serve to allow fanboys to push agendas. The thread was answered in the first five posts or so, lol.
 
The post I quoted only referenced two recent problem threads, yet ignores the count of users with the card in question who had no issues. /smh

Ahh. Ok. One of the threads at least was mostly about how AMD wasn't good at RT, and that is a VERY strong argument against AMD if you want to use RT.
 
The fanboyism, straw-grasping and cherry-picking of some here is really cringe-worthy.
 