System Name | Meh |
---|---|
Processor | 7800X3D |
Motherboard | MSI X670E Tomahawk |
Cooling | Thermalright Phantom Spirit |
Memory | 32GB G.Skill @ 6000/CL30 |
Video Card(s) | Gainward RTX 4090 Phantom / Undervolt + OC |
Storage | Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server |
Display(s) | 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR |
Case | Fractal Design North XL |
Audio Device(s) | FiiO DAC |
Power Supply | Corsair RM1000x / Native 12VHPWR |
Mouse | Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro |
Keyboard | Corsair K60 Pro / MX Low Profile Speed |
Software | Windows 10 Pro x64 |
Hm, the 3070, 3080, 3080 Ti and the entire 2000 series beg to differ. Their features sure as heck did not help how quickly they were outmoded, particularly the 3070, which essentially became unable to do RT in new AAA titles within a year and ten months.
This is a patently false statement:
View attachment 316183
The 7900 XTX is significantly faster than the 3080 in RT. Some people in this thread are greatly exaggerating Nvidia's benefits. Either card is a good choice.
As someone who recently upgraded from a 1080 Ti, I can say the card's 11GB of VRAM was absolutely critical to keeping it going later in its life. I was able to keep texture settings at max thanks to that, while owners of even the newer 2080 had to dial down settings. If you plan on keeping your card for a few generations, VRAM will absolutely play a role.
If you look at core count, TMUs, ROPs, etc., the 4090 has 68% more resources at its disposal. As games become more demanding and as faster CPUs allow the 4090 to stretch its wings, it will pull farther ahead. Really, the current 4080 is a 4070 / 4060 Ti if you compare die size; Nvidia is giving you much less for your money compared to prior gens. There's nothing that can be done about that though, both AMD and Nvidia like to keep prices high.
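As a rough sanity check on that 68% figure, here is a back-of-the-envelope comparison; the unit counts below are assumptions taken from public spec sheets (e.g. the TPU GPU database), not measured results:

```python
# Back-of-the-envelope resource comparison, RTX 4090 vs RTX 4080.
# Unit counts are assumptions from public spec sheets, not benchmarks.
specs = {
    "RTX 4090": {"shaders": 16384, "tmus": 512, "rops": 176},
    "RTX 4080": {"shaders": 9728,  "tmus": 304, "rops": 112},
}

for unit in ("shaders", "tmus", "rops"):
    big = specs["RTX 4090"][unit]
    small = specs["RTX 4080"][unit]
    print(f"{unit}: 4090 has {100 * (big / small - 1):.0f}% more")

# With these assumed counts:
#   shaders: 4090 has 68% more
#   tmus: 4090 has 68% more
#   rops: 4090 has 57% more
```

So shader and TMU counts line up with the ~68% figure, ROPs are a smaller gap, and of course raw unit counts don't translate 1:1 into fps.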
Yeah, let's see about that. 3070 8GB vs 6700 XT 12GB, which launched at $499 and $479 respectively -> https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/35.html
The 3070 easily wins in 4K/UHD minimum fps.

You can always find a game or two that will bring a GPU to its knees, even a 4090, so who cares? People adjust settings to get 60+ fps or even 100-120+ fps, and doing this lowers VRAM usage on weaker GPUs anyway, because they can't handle demanding games on max settings at high res in the first place - pointless to put tons of VRAM on a weak GPU.

The 1080 Ti has been slow for years. Today it is not even considered mid-end. You will not be maxing any demanding games at high res using a 1080 Ti regardless of VRAM - the chip itself is dated and weak, stop lying LMAO.
Techspot did a 3080 vs 6800 XT revisit in 2022 and 2023 and the 3080 won by 6-7% at 4K/UHD -> https://www.techspot.com/review/2427-geforce-rtx-3080-vs-radeon-6800-xt/
VRAM is only important if the chip is fast enough to actually use it. Logic 101. And no chip will be considered fast in 5+ years, not even the 4090.
The 1080 Ti has 11GB, which is decent, but the chip is mediocre, forcing you to lower settings anyway - which in turn lowers VRAM requirements and usage.
Cheap cards with a lot of VRAM are not going to age well regardless; the chip is going to be the limiting factor. Again, look at the 6700 XT/6750 XT. Look at the 1080 Ti, the 2080 Ti, the 6800 16GB - none of them are considered fast today, and none of them will be able to max out games at high res anyway.
Or even look at the 3090 24GB, which is beaten by the 4070 Ti 12GB in pretty much every game at 4K/UHD today, and the 4070 Ti is not even aimed at 4K/UHD. At some point, yeah, more than 12GB will be useful, but NONE OF THESE CARDS are going to run games on settings that REQUIRE MORE THAN 12GB by the time that happens. How can you not understand this simple fact? Higher settings PUSH THE GPU AS WELL, not only VRAM, LMAO!
I always laugh when people think they can futureproof their GPU with a lot of VRAM... After 2-3 generations, not even AMD/Nvidia is going to bother supporting the card anyway, and you are better off buying a new mid-end card that will wreck that old "futureproofed" card with ease at half the power usage or even less.

It does not matter, all cards perform identically; the Hellhound is just pre-OCed.
And no, the 4080 isn't only 17% faster in RT - it's just that the averages include games that barely make use of it. In heavy RT games, you know, the ones where you actually want to enable it because it looks nice, the difference can definitely get higher than 50%.
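To show how averaging over a mixed suite compresses the gap, here is a toy calculation; every per-game number below is purely hypothetical, made up only to illustrate the effect, not a benchmark result:

```python
# Hypothetical relative RT performance of card A vs card B (100 = parity).
# All per-title numbers are invented for illustration, not real benchmarks.
light_rt_titles = [103, 101, 99, 102, 100]  # titles with very light RT use
heavy_rt_titles = [152, 162]                # titles with heavy RT workloads

mixed_suite = light_rt_titles + heavy_rt_titles
mixed_avg = sum(mixed_suite) / len(mixed_suite)
heavy_avg = sum(heavy_rt_titles) / len(heavy_rt_titles)

print(f"Mixed-suite average: card A leads by {mixed_avg - 100:.0f}%")
print(f"Heavy-RT-only average: card A leads by {heavy_avg - 100:.0f}%")
# With these made-up numbers: ~17% over the mixed suite vs ~57% in heavy RT titles.
```

Whether the headline gap looks like ~17% or 50%+ depends entirely on which titles go into the average, which is the point being made above.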
You know why the 7900 XTX is faster in Far Cry 6 even with RT? Because the game effectively doesn't have RT. All of these AMD-sponsored games make as little use of RT as possible, to the point that it actually looks WORSE than running without RT. ALL of them. Resident Evil Village runs RT shadows at 1/4 resolution. The only reason this is happening is to fool people into thinking Nvidia and AMD are close in RT performance. They are not.
Right on. AMD-sponsored titles with RT are not using proper RT, meaning effects render at reduced resolution (around 1/4) to keep AMD GPUs from dropping to insanely low levels of performance.
Just look at Cyberpunk ray tracing performance. The 7900 XTX loses to the 3080, and with path tracing enabled the 7900 XTX is at around 4060 8GB level.
Games with ACTUAL PROPER RT, and especially PT, are killing AMD GPUs - the framerate tanks. Just like consoles are useless for RT even though some games claim to have it.
RDNA4 won't have high-end SKUs and RT perf won't improve.
RDNA5 might be able to do RT properly; we will see in 2025+.