
AMD's RDNA 4 GPUs Could Stick with 18 Gbps GDDR6 Memory

It depends on what you are looking for a GPU to be able to perform well on. Most gamers today are on the ray tracing bandwagon, and Nvidia GPUs are superior to AMD GPUs in that respect.

No mention of ray tracing was made in his original comment. None at all. I've no idea why you're trying to move the goalposts for him, but you should let him make his own logical fallacies instead of perpetuating them yourself.

Most gamers are in fact not on the ray tracing bandwagon. Both HWUB and GamersNexus did a poll and less than 30% of enthusiasts consider ray tracing an essential factor when purchasing a video card. That's just among enthusiasts as well, the actual rate at which ordinary people care is without a doubt even lower (especially given the average PC gamer doesn't even have a card that can use RT at all or can't do so at an acceptable FPS given the cost of well performing RT capable cards). The idea that everyone is on the RT bandwagon is completely incorrect.

I'd also add that even if his argument was that AMD is embarrassing performance-wise due to its RT performance (which, again, wasn't his original argument and isn't even mentioned in his post; it's just something you fabricated to defend his comment for some unknown reason), the 7900 XTX is only 17% slower in RT performance. That's not what I call embarrassing, not by a long shot.

[attached screenshot: RT performance chart]


Neither the original argument nor the goalpost-shifted argument 'he must be referring to RT performance' has any ground to stand on. Utter tripe. I don't see the point in jumping into the line of fire for an argument that was almost certainly made with disingenuous intent.
 
Most gamers are in fact not on the ray tracing bandwagon. Both HWUB and GamersNexus did a poll and less than 30% of enthusiasts consider ray tracing an essential factor when purchasing a video card.
Would you be willing to provide a link to those polls?
 
Would you be willing to provide a link to those polls?

Not the exact polls I was referencing but I was able to pull up some that I've voted on in the past via my history:

[attached screenshots: HWUB and GN ray tracing poll results]


According to the GN poll 31% of people didn't have an RT capable card and a further 41% didn't even bother enabling it. This is among the enthusiast community, the numbers are going to be much more against RT in the broader PC gaming population.
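To make that arithmetic explicit, here is a quick back-of-the-envelope sketch using the two GN poll figures quoted above (treating both as shares of all respondents, which is an assumption about how the poll was structured):

```python
# Rough arithmetic on the GN poll figures quoted above.
# Assumption: both percentages are shares of all respondents.
no_rt_card = 0.31      # respondents without an RT-capable card
rt_card_unused = 0.41  # respondents with an RT-capable card who leave RT off

not_using_rt = no_rt_card + rt_card_unused
using_rt = 1.0 - not_using_rt

print(f"Not using RT at all: {not_using_rt:.0%}")   # 72%
print(f"Actually enabling RT: {using_rt:.0%}")      # 28%
```

In other words, under that reading, fewer than three in ten enthusiast respondents actually use RT, which lines up with the sub-30% figure mentioned earlier in the thread.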
 
It depends on what you are looking for a GPU to be able perform well on. Most gamers today are on the ray tracing bandwagon and the Nvidia GPUs are superior to the AMD GPUs in that respect.
If by superior, you mean slightly less shit, but still pretty much unusable, then you're right. Nothing except for the 4090 can ray trace properly, so arguing who has the upper hand is pointless media anger.
 
Most gamers today are on the ray tracing bandwagon
You cannot make this claim without numbers to back it up.

Who are "most gamers"? Do you mean mobile gamers? Esports gamers?
 
Ray tracing is definitely still a "niche" area in gaming. Just have a quick gander at the Steam charts: out of the top 9 Nvidia GPUs on the list, at most 2 have a decent chance of regularly running ray tracing of any form (the 3070 and 4070), and those sit in 9th and 5th place. Two of them can't run ray tracing of any kind without it turning into PowerPoint Simulator 2024 (the 1650 and 1060), and the rest would be extremely crippled in performance by enabling it.

[attached screenshot: Steam Hardware Survey GPU chart]


So out of the ~34% of surveyed gamers covered by those cards, only about 6.5% have a good chance of regularly running ray tracing. That's still not amazing market penetration for a tech that is now nearly 6 years old.
Now, I will admit it's only with the 40xx and upcoming 50xx series that we've really had the RT performance to turn it on without asking what major downgrades the rest of the settings need to make it work in AAA games.
I would argue that most people with the capability turn it on to see what it looks like, but then either turn it right down or off to avoid the performance hit.
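Spelling out the share implied by those two percentages (both taken from the post above; the division is the only step added here):

```python
# Back-of-the-envelope from the Steam-chart reasoning above.
top9_share = 0.34        # ~34% of surveyed gamers own one of the top 9 Nvidia cards
rt_viable_share = 0.065  # ~6.5% own a card (3070/4070) that handles RT comfortably

# Within that top-9 group, the fraction with an RT-viable card:
fraction = rt_viable_share / top9_share
print(f"RT-viable share of that group: {fraction:.0%}")  # roughly 19%
```

So even restricted to owners of the most popular Nvidia cards, only about one in five has hardware that runs RT comfortably.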

Also, all the hype around DLSS/FSR/XeSS etc. is just a crutch for game developers to skip optimising their games to the same level as before, or to remove the need for optimisation tweaks in patches so they can focus on DLC/skins/microtransactions from day 0. Like Nvidia coming out with marketing materials going "LOOK AT THIS 5X PERFORMANCE UPLIFT GEN TO GEN" and then in 0.5 font "When using DLSS 3.5".
It's like a car manufacturer announcing "We have now released a 155 mph capable car while maintaining 100+ MPG on your daily trips", and then you read the fine print to notice it's while traversing the side of Mount Everest for the gravity assist or some other bullshit.
 
Also, all the hype around DLSS/FSR/XeSS etc. is just a crutch for game developers to skip optimising their games to the same level as before, or to remove the need for optimisation tweaks in patches so they can focus on DLC/skins/microtransactions from day 0. Like Nvidia coming out with marketing materials going "LOOK AT THIS 5X PERFORMANCE UPLIFT GEN TO GEN" and then in 0.5 font "When using DLSS 3.5".
It's like a car manufacturer announcing "We have now released a 155 mph capable car while maintaining 100+ MPG on your daily trips", and then you read the fine print to notice it's while traversing the side of Mount Everest for the gravity assist or some other bullshit.
Yeah I think they managed the speed and mileage by pushing it out the back of a C-130 @ 10000 feet.

You're not wrong about it being niche, of the people I've helped build systems for in the last 12 months, I've had to explain ray tracing to them in about 2/3 of the cases.

I really want to know what AMD (and this forum) thinks will tempt me to 'upgrade' from 6800XT.
 
I really want to know what AMD (and this forum) thinks will tempt me to 'upgrade' from 6800XT.
I think there's zero reason to upgrade from a 6800 XT. ;)

The only reason I'm thinking about replacing my 7800 XT is that it eats almost as much power playing a video as my CPU does under full load. Its performance is more than enough, and I don't care about RT or DLSS.
 
I think there's zero reason to upgrade from a 6800 XT. ;)

The only reason I'm thinking about replacing my 7800 XT is that it eats almost as much power playing a video as my CPU does under full load. Its performance is more than enough, and I don't care about RT or DLSS.
Same boat re DLSS/RT.

I've said this before, but if it isn't 7900 XTX-ish performance with better power consumption at a healthy price, I won't be interested. Scarily, I could even consider moving to team green, which I'd prefer not to do.
 
Not the exact polls I was referencing but I was able to pull up some that I've voted on in the past via my history:

[attached screenshots: poll results]

According to the GN poll 31% of people didn't have an RT capable card and a further 41% didn't even bother enabling it. This is among the enthusiast community, the numbers are going to be much more against RT in the broader PC gaming population.
Ironically, RT works best the moment the performance tax is low enough for people to just 'turn it on', and when it ubiquitously looks better given that performance cost. Neither is the case today. Yes, sometimes RT makes a marked difference, but it is highly scene dependent, to the point that it's not much different from hand-crafted scenes with well-placed rasterized/dynamic lighting. RT doesn't do much that is truly new or that couldn't be done already, if used sparingly. Now you can splash RT all over the scene and still waste tons of resources, but the dedicated hardware makes it somewhat manageable.

We've still got a long way to go, and brute-forcing the whole scene like they're doing now isn't the way; the real deal will come from engine developments like Nanite that are software-based before being hardware-specific. In the end, hand-crafting will still matter, much like it did before. Lots of new games on new engines don't look all that great. There's no love in them, just high-fidelity assets. That alone won't make the scene or the game. Evidently, graphics don't make the game.
 
Ironically, RT works best the moment the performance tax is low enough for people to just 'turn it on', and when it ubiquitously looks better given that performance cost. Neither is the case today. Yes, sometimes RT makes a marked difference, but it is highly scene dependent, to the point that it's not much different from hand-crafted scenes with well-placed rasterized/dynamic lighting. RT doesn't do much that is truly new or that couldn't be done already, if used sparingly. Now you can splash RT all over the scene and still waste tons of resources, but the dedicated hardware makes it somewhat manageable.

We've still got a long way to go, and brute-forcing the whole scene like they're doing now isn't the way; the real deal will come from engine developments like Nanite that are software-based before being hardware-specific. In the end, hand-crafting will still matter, much like it did before. Lots of new games on new engines don't look all that great. There's no love in them, just high-fidelity assets. That alone won't make the scene or the game. Evidently, graphics don't make the game.

Most RT implementations aren’t path tracing, which is still too demanding for current hardware. This results in most RT games being gimmick settings while murdering your performance and providing little to no visual benefit; there are a select handful of games where the visual impact is actually enough to justify the performance hit - absolutely a niche.

Ray Traced effects and RTX are almost as bad as the PhysX nonsense. Modern hardware and consoles alike simply aren’t powerful enough for real “ray” (path) tracing.
 
Most RT implementations aren’t path tracing, which is still too demanding for current hardware. This results in most RT games being gimmick settings while murdering your performance and providing little to no visual benefit; there are a select handful of games where the visual impact is actually enough to justify the performance hit - absolutely a niche.

Ray Traced effects and RTX are almost as bad as the PhysX nonsense. Modern hardware and consoles alike simply aren’t powerful enough for real “ray” (path) tracing.

I'm currently playing Hogwarts Legacy. Runs really well without RT, no stutter or big drops outside of a few cutscenes. 4K60 DLSS Quality with high settings.
I tried RT AO and reflections. Had to drop to DLSS Performance, but even with that the game was hitching a lot. Doesn't seem VRAM related, as the usage was between 10-11 GB. Could be CPU related, but that's still just a bad implementation (all CPUs have terrible 1% lows in this game with RT).

RT reflections look good, I love that they don't disappear. But RTAO doesn't look much better, there are still some disocclusion artifacts. It's absolutely not worth the hit to performance.

RT is pretty much pointless on anything but the top-end GPU. I can't imagine when the 5090 comes out and they start "optimizing" games for that. But it is kind of like future-proofing: when you play the game 5 or 10 years later, you can enable all the eye candy.
But I would still rather get more shaders instead of RT cores. I do want DLSS, though, extremely useful on a 4K TV.
 
I'm currently playing Hogwarts Legacy. Runs really well without RT, no stutter or big drops outside of a few cutscenes. 4K60 DLSS Quality with high settings.
I tried RT AO and reflections. Had to drop to DLSS Performance, but even with that the game was hitching a lot. Doesn't seem VRAM related, as the usage was between 10-11 GB. Could be CPU related, but that's still just a bad implementation (all CPUs have terrible 1% lows in this game with RT).

RT reflections look good, I love that they don't disappear. But RTAO doesn't look much better, there are still some disocclusion artifacts. It's absolutely not worth the hit to performance.

RT is pretty much pointless on anything but the top-end GPU. I can't imagine when the 5090 comes out and they start "optimizing" games for that. But it is kind of like future-proofing: when you play the game 5 or 10 years later, you can enable all the eye candy.
But I would still rather get more shaders instead of RT cores. I do want DLSS, though, extremely useful on a 4K TV.
I'm playing Hogwarts Legacy, too. I haven't encountered any hitching with RT on, but it tanks performance while providing no visual upgrade that I could notice. If I have to enable FSR to be able to play with a few extra effects that I can't notice anyway, then I say no thanks.
 
I'm playing Hogwarts Legacy, too. I haven't encountered any hitching with RT on, but it tanks performance while providing no visual upgrade that I could notice. If I have to enable FSR to be able to play with a few extra effects that I can't notice anyway, then I say no thanks.

The 7000X3D chips seem to be the only CPUs that can run this game with RT, so I'm not surprised you don't get hitching. ;)

 