
AMD RDNA4 Architecture to Build on Features Relevant to Gaming Performance, Doesn't Want to be Baited into an AI Feature Competition with NVIDIA

the nvidia vs amd argument is irrelevant
the argument is AMD throwing in the towel when they can't hack it
A company basically admitting "well, we aren't as good as our competitors, so we aren't going to even try"
yeah, that's gonna go over really well
 
gamers don't matter
ok
I can't really make that any clearer
gamers are an insignificant part of the market

This article is clearly not for you then; it is about a gaming-focused GPU, not the Instincts, but those aren't for you either, because you are obviously just a fanboi and not in the industry.
Also... RDNA3 already has tensor cores... AMD just calls them WMMA Matrix cores... They will continue to add feature sets to them... What Wang said was...

He thinks that FSR and DLSS are a waste of the matrix math engines these GPUs have... when they could be used for smarter NPCs and game-enhancing features... not just as a means to fix poor game optimization. But evidently reading is hard.

Since you are unaware...
AMD added matrix cores to CDNA 1.0 (MI100), enhanced them for CDNA 2 (MI210/MI250X), and RDNA3 got them as well; it's unclear whether they are enhanced past CDNA 2, as CDNA 3 is already in testing.
AMD has also added a Xilinx FPGA-derived AI engine to the Zen 4 laptop line for AI inferencing... and it is fairly clear they will continue to add to and support an accelerated future. This article was not about a lack of AI support but about using it for... enhancement, not as a replacement for proper game design and optimization.
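For reference, what those matrix/WMMA units actually compute is a tiled fused multiply-accumulate. A minimal NumPy sketch of the operation (the tile size and precisions here are illustrative, not AMD's exact spec):

```python
import numpy as np

# A WMMA/matrix unit fundamentally performs a tiled multiply-accumulate:
#   D = A @ B + C
# with low-precision inputs (e.g. FP16) and a higher-precision accumulator.
TILE = 16  # illustrative tile size

A = np.random.rand(TILE, TILE).astype(np.float16)  # input tile (FP16)
B = np.random.rand(TILE, TILE).astype(np.float16)  # weight tile (FP16)
C = np.zeros((TILE, TILE), dtype=np.float32)       # accumulator (FP32)

# One matrix instruction covers a whole tile; plain shader ALUs would
# need hundreds of separate multiply-adds to do the same work.
D = A.astype(np.float32) @ B.astype(np.float32) + C
print(D.shape)  # (16, 16)
```

Whether that throughput goes to upscaling or to things like NPC behaviour is exactly the choice Wang is talking about.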
 
Of course, the 7900xt was very reasonable, getting absolutely destroyed by the 4070ti...
Can't see the problem. Here in Australia, the 7900XT seems to be cheaper than the 4070ti, and as far as TPU's own reviews of the two cards are concerned, I can't see the 7900XT being destroyed by the 4070ti anywhere - in fact, the average gaming framerate chart at the end shows the 7900XT above the overclocked 4070ti at every resolution (raster)...

...unless you meant RT, or video compression, or DLSS, or something else - but you didn't say any of that.
 
Who really cares? They're both right and wrong. Besides upscaling, the ML hardware accelerators really are worthless in the consumer space, and at the same time they won't be used for anything else any time soon.

Gamers aren't the core drivers of the market anymore.
gamers are an insignificant part of the market

You're both beyond utterly wrong though; over $3 billion in revenue is not insignificant by any stretch of the imagination.
[attached revenue chart]



They've always made a huge chunk of their money from consumer products. Sadly for Nvidia, marketing and sponsorship deals don't work very well outside of the consumer market. You can't buy your way to success as easily and actually have to provide extremely competitive pricing, because ROI is critical to businesses, as opposed to regular consumers, so you can't just price everything to infinity and provide shit value.
 
Agreed

As things become more and more demanding,
Nvidia will have to allocate more and more die space to dedicated AI processing units.
Soon it will reach a critical point where there is too much 'dead weight' and it is no longer cost-effective to do it,

and Nvidia themselves will have to find a way to integrate AI processing back into the "normal" stream processors.
So the cycle begins again.
This.

Nvidia is desperately looking to repurpose its fancy cores because all that die space is sitting there. They use it for DLSS, while AMD achieves near-similar results with FSR without using that die space. We can talk for ages about whether each pixel is arranged favorably on a per-game basis, but if you zoom out a little bit (that is, to a normal seating distance), you can't really tell the difference.
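To illustrate why no dedicated cores are needed: a spatial upscaler in the FSR 1.0 vein is just ordinary shader math, roughly an upsample followed by a sharpening pass. A heavily simplified NumPy/SciPy stand-in (bilinear plus unsharp mask, not the real EASU/RCAS kernels):

```python
import numpy as np
from scipy.ndimage import zoom, gaussian_filter

def toy_spatial_upscale(img, scale=1.5, sharpen=0.5):
    """Bilinear upscale + unsharp mask: a toy stand-in for an
    FSR 1.0-style spatial upscaler. Pure array math, so it runs on
    regular shader ALUs -- no matrix/tensor units required."""
    up = zoom(img, (scale, scale, 1), order=1)           # bilinear upsample
    blurred = gaussian_filter(up, sigma=(1, 1, 0))       # low-pass copy
    return np.clip(up + sharpen * (up - blurred), 0, 1)  # re-sharpen edges

# Example: take a 720p frame to ~1080p
frame = np.random.rand(720, 1280, 3).astype(np.float32)
out = toy_spatial_upscale(frame)
print(out.shape)  # (1080, 1920, 3)
```

The real kernels are much smarter about edges, but the point stands: it is plain ALU work.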

Nvidia is pushing RT, but AMD is carrying it just fine, again without special cores. AMD's approach is clearly superior when you see the performance gap relatively shrink between RDNA2 and 3, and between AMD and Nvidia. We're building larger GPUs anyway, and it's a major step back to have parts of the die sit there not being used. We figured this out decades ago... Nvidia's move since Turing was, is and will continue to be a competitive regression, not progress. They're using die space for single-purpose nonsense that barely pays off. We're paying for that die space big time.

As long as AMD controls the console space as the only supplier of a capable custom APU with decent graphics, and that's a position they'll keep, they can dictate the progress of RT and gaming in general, because everything has to run on their hardware. And they're making it happen as we speak. You cán use RT on AMD. You cán use FSR on any GPU. The technologies work and pay off just the same.

Nvidia is diving into a hole, and this is a long journey. Let's see where it ends. I don't think AMD is using a bad strategy here, nor a loser's strategy. Chasing the most popular soundbites isn't by definition the best way forward. And if you combine this with the general economic situation... wasting cores and valuable die space on special-purpose hardware makes even less sense.
 
Can't see the problem. Here in Australia, the 7900XT seems to be cheaper than the 4070ti, and as far as TPU's own reviews of the two cards are concerned, I can't see the 7900XT being destroyed by the 4070ti anywhere - in fact, the average gaming framerate chart at the end shows the 7900XT above the overclocked 4070ti at every resolution (raster)...

...unless you meant RT, or video compression, or DLSS, or something else - but you didn't say any of that.
Nowadays, yes, they have similar prices; AMD dropped the price a couple of weeks ago. In the Hogwarts Legacy benchmark from PCGH, with the latest patch, the 7900xt is losing to a 3080 while consuming 30% more power than the 4070ti.

[Chart: PCGH Hogwarts Legacy benchmark]
 
Thing is, NV can quickly and easily catch up with AMD on any non-visual AI implementation, but AMD will never get close to NV in the visual AI department.
AMD perpetuates the current state by choosing this route.
 
let me put this to you

What if I were to combine AI art generation with ChatGPT's natural language interface and something like Unreal Engine 5? (We really are not far away from this at all; all the pieces exist, it just takes somebody to bring it all together.)

What if you could generate entire environments just by telling an AI to "show me the bridge of the Enterprise"?
If you can't see the potential and the way the winds are shifting, may our soon-to-exist AI god have mercy on your fleshy soul.
Then you would have hype, and then it would die down as people slowly figured out this is nothing other than what procedural generation has been doing in No Man's Sky. We're humans and we love to figure out patterns. The idea is that AI is too complex to show us those, so we'll consider it more 'sentient'. But then we ask 15 consecutive questions, memory runs out, and it starts throwing out gibberish.

Come on, bro. Our 'AI' technology is input > output and lots of training. It's tunnel vision waiting to happen, and we've seen it all before. It's just a fancy, slightly less random RNG. These things are full of paradoxical stuff, and we're discovering that as we speak... err, chat.
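To put a number on that 'slightly less random RNG' jab: text generation really does boil down to sampling the next token from a learned, weighted distribution. A toy sketch with made-up scores (no real model involved, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

# Made-up "model scores" for the next word. A real model would produce
# these from billions of trained weights; here they are hand-picked.
tokens = ["the", "bridge", "enterprise", "banana"]
scores = np.array([2.0, 1.6, 1.4, -3.0])

def sample_next(scores, temperature=0.8):
    # Softmax with temperature: lower T -> less random, higher T -> more.
    p = np.exp(scores / temperature)
    p /= p.sum()
    return rng.choice(len(scores), p=p)

print(tokens[sample_next(scores)])  # usually "the", occasionally not
```

Weighted dice, basically; the cleverness is all in how the weights were learned.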

In gaming, what is the big constant? Good design: talented developers make great games when given the tools. The tools are just tools. If you haven't got brilliance at the helm, you'll get dime-a-dozen crap, and we have a well fleshed-out history of said games and the differences. AI won't change that at all. It'll only lower the bar for more bottom-feeding bullshit.
 
the nvidia vs amd argument is irrelevant
the argument is AMD throwing in the towel when they can't hack it
A company basically admitting "well, we aren't as good as our competitors, so we aren't going to even try"
yeah, that's gonna go over really well

What do you expect! Nvidia's mindshare is just too strong, and AMD can't get away from their "bad driver" stigma no matter how much they improve their drivers.
 
What do you expect! Nvidia's mindshare is just too strong, and AMD can't get away from their "bad driver" stigma no matter how much they improve their drivers.
Definitely, it's the stigma. It's not like current top-end AMD cards losing to a 3080 in Hogwarts Legacy's latest patch is the issue.
 
Of course, the 7900xt was very reasonable, getting absolutely destroyed by the 4070ti. And their CPUs? Oh, those are insanely reasonable: they priced 6 cores at €300 MSRP when their competition asks €350 for 14 cores that smack it around the room.
7900 XT beats 4070 Ti by 10% at 1440p and has 20GB with 800 GB/s - 4070 Ti has 12GB with 504 GB/s.

And Intel uses big.LITTLE, meaning tons of efficiency cores, while AMD uses performance cores only, so you can stop comparing cores like that. The Ryzen 7800X3D will probably smack the i9-13900KS in gaming when it releases in a few weeks.
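Those bandwidth figures are easy to sanity-check from bus width and memory data rate; a quick check using the commonly listed specs (320-bit / 20 Gbps GDDR6 vs 192-bit / 21 Gbps GDDR6X):

```python
# Memory bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (Gbps)
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

print(bandwidth_gbs(320, 20))  # 7900 XT: 320-bit, 20 Gbps  -> 800.0 GB/s
print(bandwidth_gbs(192, 21))  # 4070 Ti: 192-bit, 21 Gbps  -> 504.0 GB/s
```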
 
Definitely, it's the stigma. It's not like current top-end AMD cards losing to a 3080 in Hogwarts Legacy's latest patch is the issue.

[Chart: PCGH Hogwarts Legacy benchmark]
Until you get to Hogsmeade, and then

[Chart: Hogsmeade RT, 1080p]


Then at 1080p the 3080 chokes and can't even keep up with the 3060, which at least manages to hit 30 fps.

At 4K nothing manages to hit a solid 60 fps. Even the 4090 suffers a fair number of drops below 50, but at least that could work; everything else is just too slow unless you want to turn on DLSS/FSR or accept 30 fps.

[Chart: Hogsmeade RT, 2160p]
 
Until you get to Hogsmeade, and then

[Chart: Hogsmeade RT, 1080p]


Then at 1080p the 3080 chokes and can't even keep up with the 3060, which at least manages to hit 30 fps.

At 4K nothing manages to hit a solid 60 fps. Even the 4090 suffers a fair number of drops below 50, but at least that could work; everything else is just too slow unless you want to turn on DLSS/FSR or accept 30 fps.

[Chart: Hogsmeade RT, 2160p]
That's the old version, before the patch. PCGH's numbers are from two days ago, with the latest patch.
 
Until you get to Hogsmeade, and then

[Chart: Hogsmeade RT, 1080p]


Then at 1080p the 3080 chokes and can't even keep up with the 3060, which at least manages to hit 30 fps.

At 4K nothing manages to hit a solid 60 fps. Even the 4090 suffers a fair number of drops below 50, but at least that could work; everything else is just too slow unless you want to turn on DLSS/FSR or accept 30 fps.

[Chart: Hogsmeade RT, 2160p]
Conclusion: the game is an absolute joke, performing like that for the way it looks. We shouldn't care.

Or... we can feed the troll here just a bit more with random charts from random moments in time.
 
7900 XT beats 4070 Ti by 10% at 1440p and has 20GB with 800 GB/s - 4070 Ti has 12GB with 504 GB/s.

And Intel uses big.LITTLE, meaning tons of efficiency cores, while AMD uses performance cores only, so you can stop comparing cores like that. The Ryzen 7800X3D will probably smack the i9-13900KS in gaming when it releases in a few weeks.
It beats it by 10%, but until a few days ago it was 14% more expensive, while massively losing in RT and consuming 30% more power. Great deal, my man.
 
That's the old version, before the patch. PCGH's numbers are from two days ago, with the latest patch.

PCGH did not test in Hogsmeade village, so until that test is done with the new patch and new data is generated, there is no evidence that it is entirely fixed in that scenario.
 
No, gaming is still Nvidia's top-earning segment. To go as far as the poster you are quoting and say that gamers are irrelevant is laughably incorrect. 50% of your revenue is not remotely insignificant.
The real question is how much of that is actually made up of pure gamers rather than professionals who don't need a Quadro (people like Beeple, Ash Thorp and Vitaly Bulgarov made millions with their computers and worked on blockbusters without Quadros). I've seen people working in the creative industry say that cards like the 3090/4090/4080 are so expensive because they are targeting people who can actually justify them as a work expense rather than leisure.
 
PCGH did not test in Hogsmeade village, so until that test is done with the new patch and new data is generated, there is no evidence that it is entirely fixed in that scenario.
How about TPU's numbers? Let's ignore those as well, right?
 
This one-game side quest has no end and no purpose in this thread.

AMD will need to come up with some kind of DLSS 3 answer, even if it is not frame-generation related, to make up for the performance gap.
I don't see "procedural world generation, NPCs, bot AI", as the writer says, helping to do that.
 
This thread is looking like troll country.
I agree; if you don't gobble up and fully support everything AMD does, then you're a troll.
 
How about TPU's numbers? Let's ignore those as well, right?

It clearly shows the game is not really ready to be a staple in a benchmark suite; different testing areas give such varied results that there is no single correct conclusion that can be drawn.

Trying to use it as part of an argument, given these obvious and observable shortcomings, suggests you have a disingenuous agenda.
 
Nowadays, yes, they have similar prices; AMD dropped the price a couple of weeks ago. In the Hogwarts Legacy benchmark from PCGH, with the latest patch, the 7900xt is losing to a 3080 while consuming 30% more power than the 4070ti.
Here's a different game where the 4070ti gets annihilated even by Nvidia's own older-generation cards, let alone a 7900XT, so what's your point? Though I am sure you're going to find some nonsensical explanation as to how this chart is less relevant than yours because, I don't know, fanboy reasons, I guess.

[attached benchmark chart]
 
Here's a different game where the 4070ti gets annihilated even by Nvidia's own older-generation cards, let alone a 7900XT, so what's your point? Though I am sure you're going to find some nonsensical explanation as to how this chart is less relevant than yours because, I don't know, fanboy reasons, I guess.

[attached benchmark chart]
The problem is, the 7900xt should be beating the 4070Ti across the board, since not only was it 15% more expensive, it also consumed a truckload more power. So that much is normal. Losing to a 4070ti, on the other hand, isn't really normal, and it happens across the board in the majority of RT games.
 
The problem is, the 7900xt should be beating the 4070Ti across the board, since not only was it 15% more expensive, it also consumed a truckload more power. So that much is normal. Losing to a 4070ti, on the other hand, isn't really normal, and it happens across the board in the majority of RT games.
And it doesn't? You point to one game which is brand new, where it is hard to determine the overall performance of any given card, since it jumps around depending on where you are in the game.
As far as I know, across the board, the 7900xtx is way faster than a 4070Ti.
About RT: heh, it is sad you bring that up, since not even the 4090 can pull RT off in certain games. Having 50 FPS or 20 FPS does not make a real difference. RT is nice to have, but relying solely on it is a fool's errand.
Just like pursuing some sort of AI scheme that is supposedly better for gaming. AI is not for gaming; it is making huge strides in other areas, but not gaming.
 
And it doesn't? You point to one game which is brand new, where it is hard to determine the overall performance of any given card, since it jumps around depending on where you are in the game.
As far as I know, across the board, the 7900xtx is way faster than a 4070Ti.
About RT: heh, it is sad you bring that up, since not even the 4090 can pull RT off in certain games. Having 50 FPS or 20 FPS does not make a real difference. RT is nice to have, but relying solely on it is a fool's errand.
Just like pursuing some sort of AI scheme that is supposedly better for gaming. AI is not for gaming; it is making huge strides in other areas, but not gaming.
Well, it doesn't really matter whether you think it's worth it or not. Why is it okay for the 7900xt not only to be 15% more expensive at launch, not only to consume 30% more power, but also to get its ass handed to it in RT?
 