
AMD WMMA Instruction is Direct Response to NVIDIA Tensor Cores

Purely in terms of resale value, yes, it's probably outdone any other dGPU in the past, but you're forgetting the backdrop: a one-in-100-year global pandemic. As for your particular point about Tensor cores, correct me if I'm wrong, but outside of DLSS are they actually that useful anywhere else? The way things are shaping up right now, DLSS vs FSR will end up almost exactly like Gsync vs Freesync!

Tensor cores: DLDSR, Nvidia Broadcast, Nvidia Canvas, RT denoising (or even RT upscaling)

Well, AMD is adding instructions for ML; could that be for FSR 3.0, I wonder :roll:
 
Adding Xilinx IP to their CPUs/GPUs, is it? That's the next logical step from the acquisition. They aren't necessarily competing with Nvidia here; it's more like a counter to Intel.
 
Ray tracing is a joke anyway; no one is missing much by not having it. It's true we haven't had many AAA releases, but even after so long we have only a few games that have done anything meaningful with it. It seems more like a must-have buzzword for the box than anything else.

Because the hardware is not there on consoles, and won't be at least until MS and Sony get their mid-cycle refreshes out. The games where devs clearly put effort into it look amazing, and if you don't see it, well, I guess it's pointless to even try to reason here, lol.
 

I have an RTX card as you can see. I played Metro, CP77, and Control, and it looks amazing, especially in Control, but that doesn't take away from the fact that there are very few games four years later, and even fewer that did anything meaningful with it. Tomb Raider is nothing, for example, just marketing.
 
Look, you said the tech is a joke - I disagree. I think the tech is amazing and I'd much rather have it than not. I would absolutely consider an AMD card in a few months if they have vastly improved it, and conversely I'm 100% not bothered about what they have to offer if it's lagging by 40-50% again. Even if it's just another five big games in two years, I wanna see those games at their best, no compromises. I'm pretty sure the market would agree too - unless AMD are content to sit around 20% forever, they'd better bring feature parity, and dare not return to their Polaris glory days of black screens on every other driver update.
 
It is not a joke, because it makes the Nvidia cards fly off the shelves - the Nvidia users claim it's better because of:
1. Ray-tracing;
2. DLSS;
3. CUDA, tensor, etc...
I have yet to meet any Nvidia users who bought their card because of RT. Or even DLSS, for that matter.
I do know some that use CUDA, though, but they are not gamers.
I bought my RTX card because of performance and price. I have tried RT a few times but have been underwhelmed by how little it adds and how much it tanks performance for what little it offers. None of the games I play regularly even support these features.

Nvidia cards are bought mostly because of mindshare and performance. It used to be perf per watt too, but that's gone now. It has little to do with features.
 
Just a very wild theory: someone will make a shitcoin that uses "specialized cores" on a GPU for mining. Not saying it will happen, but it's something to consider, knowing how desperate people can get.
 
Love the mental gymnastics on display all over the web regarding this announcement.

The popcorn tastes better than ever.
 
God, I hope not. The crypto-bros need to go DIAF already.
 
Nvidia cards are more "popular" than AMD among the general populace; that's just a fact. Whether rightly or wrongly is a separate debate!

Also, AMD was basically MIA from the top-end GPU space from Polaris till RDNA2! So anyone wanting 2070/2080-tier performance either had to go Nvidia or wait 3 (4?) years.
 
Sell the card and buy a 16 GB Radeon RX 6800 XT or a 12 GB Radeon RX 6700 XT.

Why did you buy it in the first place?

Just because someone has an RTX card, it doesn't mean they're making use of everything it can do. Why do you care if someone has an Nvidia or AMD card? Who cares what they do with it... use DLSS or FSR... enable RT or not in games... you're not using the card, so why do you care what someone else does with it to enjoy playing games?

Suggesting he sell the card, for which he'd maybe get $500 used, and buy a 6700 XT that's about the same performance (maybe slightly behind) but costs upwards of $550 (or more), is stupid. Even dumber is suggesting he buy a 6800 XT that can cost upwards of $850.

I can honestly say the only game I've played on my 3080 that allows use of DLSS and RTRT was Metro Exodus, and after toggling both on and off, I wasn't impressed by the RT or the DLSS. Does that mean I won't use them down the road in other games? Who knows, I may or may not.

I personally haven't been impressed with RT from either side since it came out last gen. Do I feel bad for AMD because they're so far behind what Nvidia can do, or do I feel bad for Nvidia for having dedicated RT cores to handle it and still sucking at it?

Maybe next gen, but more likely the one after, AMD and Nvidia should be able to handle RT without tanking performance like hell or requiring DLSS/FSR to make up for the lost performance. As it stands, for me, RT is just a gimmick used as a selling point. I also see DLSS and FSR as having been designed to keep performance (FPS) up since hardware hasn't been able to keep up with advances in games... that, or they're designed to help offset the poor coding from devs who launch games that run like crap.
 
Maybe next gen, but more likely the one after, AMD and Nvidia should be able to handle RT without tanking performance like hell or requiring DLSS/FSR to make up for the lost performance
I doubt that; they'll just add more "Jiga" rays in there! RT is like the arms race we see with high resolutions (in gaming) or virtually any high-end computing realm.
 

It's always opposition here.
When someone here asks what to buy, the Nvidia users jump in immediately and begin listing those mystical "features" and justifications for why that someone has to buy an Nvidia card and nothing else.

Here, I am doing the opposite - trying to figure out what the response would be if I said the same, and people like you jump down my throat claiming it wasn't that.

I don't know if it's simple trolling or weird forum psychology, but it's ugly and not fair.
 
I wonder if Xilinx IP will be used with this. Use those cores for ray tracing when needed, repurpose them for something else when not.
 
So, when do we get a software compatibility layer for DLSS from AMD? Because if everyone supports it, then it becomes an industry standard (much like when NVIDIA graced us with "G-Sync Compatible" and suddenly the entire world started adding FreeSync).

Otherwise, this will end up being another closed system that nobody will use.
 
I don't know if it's simple trolling or weird forum psychology, but it's ugly and not fair.
It happens all over the web, and definitely on both sides: people desperately trying to steer buyers either towards or away from what they deem better or worse, with justification and cherry-picking of desirable specs, features, product nuances, and of course company ethics, etc.
 
Just don't play games then; wait until you are on your deathbed, I'm sure games will look amazing then.
Fortunately, games are for fun, not graphics.
What is unfortunate is that some games look awful (Control) yet run poorly; it seems like a waste of resources.
Example below: it looks pathetic. What's the point of implementing realistic lighting when the mug looks so comically unrealistic? First make the mug look like a real mug, and then and only then add the pointless RT (pointless until reasonably priced GPUs can handle RT with no effort, like tessellation before it).
 

Attachment: IMG_20220630_231233.jpg (526.9 KB)
Personally, I bought a 3060 due to CUDA and DLSS, and because it was the cheapest way to get 12 GB of VRAM for production workloads.

IMO DLSS is a valid selling point, especially for people playing at lower resolutions. At 4K, FSR works just as well as DLSS, even in ultra performance mode, but at 1080p FSR 2.0 is simply not usable at all.
 
Why would someone use DLSS at 1080p? Or FSR? It would have to be a very weak card.
 
Still, it shouldn't matter to anyone why he has a 3070.

Maybe he won it in a contest.
Maybe he stole it.
Maybe it was priced a whole hell of a lot better than the 6700 XT was when he got it (for me, I could have purchased a 3070 for around $850 at the height of the high prices, whereas the 6700 XT at the same time was over $1k and the 6800 XT was around $1,500. Even if I wanted something that performed around 6800 XT level, I'd have gone with an RTX 3080, because those were only around $1,100-1,200, much cheaper than the 6800 XT.)
Maybe someone gave it to him as a gift.
Maybe he's always used Nvidia and kept with them.
Maybe he's always used AMD/ATI before and wanted to see what it was like to use Nvidia.

You were the one who gave bad advice to sell it and pick up a 6700 XT or 6800 XT, which would cost him more than what he would sell his 3070 for. Expect to be criticized for poor comments.
 
Most people don't use that stuff. Most people just buy GPUs simply to game on them; believe it or not, they don't even know what DLSS is, much less DLDSR, some gimmick that even I'm not using. RT denoising is nothing special; you sound like an Nvidia fan. Nvidia Broadcast is remotely interesting to content creators, but I'm a content creator and I know it's subpar trash, so I'm not using it. Here you go. Easily debunked.
4K was the last big frontier, and beyond 4K it's not even worth increasing resolution anymore unless you also increase screen size, which didn't happen. The next big frontier is hence RT, and once RT hardware reaches a certain point, all games will have more and more impressive RT until everything is ray traced and photorealistic. This will of course still take considerable time.
Nvidia was so far ahead for so long, and with better drivers, that the mindshare is simply strong, but I see that even among enthusiasts AMD has picked up a lot of goodwill with Radeon. Their CPUs were always popular, but Radeon is somewhat popular now too.
That's nonsense. DLSS is competitive with FSR, there will be no compatibility layer, and DLSS isn't an industry standard either; it's just a proprietary upscaling method. DLSS is exactly like G-Sync, and that's why FSR is picking up so much goodwill from the get-go compared to it; even FSR 1.0 was very popular. And because it runs on all hardware, it will probably replace DLSS as the upscaling king in the long run. Same as with FreeSync. Freedom is simply better; non-proprietary is king.
 
I don't know whether there is going to be a dedicated tensor-like unit in their RDNA3 CUs (as in CDNA2) or not, but in the case below, the supported input formats/functions and matrix layouts are far fewer than in their matrix implementation in CDNA2, or versus Nvidia's implementation.
A logical assumption is that AMD's implementation will target an area-optimized solution for matrix calculations (as they usually do for nearly everything not contributing to classic raster performance, ray tracing included), with the actual performance potential relative to the competition being an almost secondary consideration.
Also, irrespective of performance, it seems (based on the table below) that supporting fewer formats/layouts than the competition could potentially lead to fewer use cases as well. (But first Nvidia must use their Tensor cores for more cases than just DLSS for it to even matter; maybe that time comes with Ada.)

AMDGPU          | MFMA (CDNA 2)                  | WMMA (GFX11)
Register type   | AccVGPR                        | VGPR
Input format    | F32, F16, I8, BF16, F64        | F16, BF16, IU8, IU4
Output format   | F32, I32, F64                  | F32, F16, BF16, I32
Matrix layout   | {4,16,32}x{4,16,32}x{2,4,8,16} | 16x16x16
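For anyone wondering what one of these instructions actually computes: a WMMA op is essentially a fused tile operation, D = A x B + C, here over the 16x16x16 layout from the table with (for example) FP16 inputs and an FP32 accumulator. Below is a minimal scalar reference in C++ that I've written purely to illustrate the semantics of one tile; it is not AMD's intrinsic (if I'm reading the LLVM GFX11 support right, the actual compiler builtins are named along the lines of __builtin_amdgcn_wmma_f32_16x16x16_f16_w32, with the whole tile produced by a single instruction spread across a wavefront, but treat that name as an assumption and check the compiler docs).

```cpp
#include <cstdio>

// Scalar reference for what one WMMA 16x16x16 tile op computes:
//   D (16x16, FP32) = A (16x16, FP16) * B (16x16, FP16) + C (16x16, FP32)
// Plain float stands in for FP16 here purely for readability; on the GPU the
// inputs would be packed half-precision values held in VGPRs (per the table).
constexpr int M = 16, N = 16, K = 16;

void wmma_reference(const float (&A)[M][K], const float (&B)[K][N],
                    const float (&C)[M][N], float (&D)[M][N]) {
    for (int m = 0; m < M; ++m) {
        for (int n = 0; n < N; ++n) {
            float acc = C[m][n];               // start from the accumulator tile
            for (int k = 0; k < K; ++k)
                acc += A[m][k] * B[k][n];      // multiply inputs, accumulate in FP32
            D[m][n] = acc;
        }
    }
}

int main() {
    static float A[M][K], B[K][N], C[M][N], D[M][N];
    for (int i = 0; i < M; ++i)
        for (int j = 0; j < K; ++j) { A[i][j] = 0.5f; B[i][j] = 2.0f; C[i][j] = 1.0f; }
    wmma_reference(A, B, C, D);
    std::printf("D[0][0] = %.1f\n", D[0][0]);  // 16 * (0.5 * 2.0) + 1.0 = 17.0
    return 0;
}
```

The point of exposing this in RDNA 3 is presumably exactly the kind of ML workload people are speculating about above (an FSR with a learned component, denoising, and so on), since those workloads are dominated by small-tile multiply-accumulates like this.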

Regarding FSR 2.0, it isn't equivalent to DLSS 2.x, but it's close at higher resolutions in most cases, and most of the implementations are in games that already had DLSS, so the experience developers gained implementing/using motion vectors etc. really helped the process and the results.
The problem is that FSR 2.0 uses fixed algorithms while DLSS uses machine learning, so the gap is probably going to widen as AI evolves and gets better over the years. So FSR 2.0 certainly isn't the DLSS replacement that AMD fans want; an open-source AI implementation some day, maybe, although training isn't free.
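On the "fixed algorithms" point: the non-ML core of an FSR 2.0-style temporal upscaler is hand-written reprojection, i.e. take last frame's accumulated history, shift it by the motion vectors, and blend in the new jittered low-resolution sample. Here's a deliberately toy 1D sketch of that idea in C++ (my own illustration with made-up names, not AMD's code; the real algorithm adds depth/disocclusion tests, neighbourhood clamping, lock handling, sharpening, etc.):

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

// Toy 1D sketch of the non-ML core of a temporal upscaler (FSR 2.0-style):
// reproject last frame's history using a motion vector, then blend it with
// the current low-resolution sample. All names here are illustrative only.
struct Frame {
    std::vector<float> color;   // output-resolution history buffer
};

float sample_linear(const std::vector<float>& buf, float x) {
    // Clamp-to-edge linear filter, the bare minimum needed for reprojection.
    if (x <= 0.0f) return buf.front();
    if (x >= buf.size() - 1.0f) return buf.back();
    int i = static_cast<int>(x);
    float t = x - i;
    return buf[i] * (1.0f - t) + buf[i + 1] * t;
}

void temporal_upscale(const std::vector<float>& current_lowres, // new low-res samples
                      const std::vector<float>& motion,          // pixels moved since last frame
                      Frame& history, float blend = 0.1f) {
    std::vector<float> out(history.color.size());
    float scale = static_cast<float>(current_lowres.size()) / out.size();
    for (std::size_t p = 0; p < out.size(); ++p) {
        float reprojected = sample_linear(history.color, p - motion[p]); // where this pixel was last frame
        float fresh = sample_linear(current_lowres, p * scale);          // new (aliased) sample
        out[p] = reprojected * (1.0f - blend) + fresh * blend;           // accumulate detail over frames
    }
    history.color = out;
}

int main() {
    Frame hist{std::vector<float>(8, 0.0f)};
    std::vector<float> low(4, 1.0f), motion(8, 0.0f);
    for (int i = 0; i < 20; ++i) temporal_upscale(low, motion, hist);
    std::printf("history value ~ %.3f\n", hist.color[4]);  // climbs towards 1.0 as frames accumulate
    return 0;
}
```

Most of the quality battle is in how much you trust the history versus the fresh sample per pixel; a fixed heuristic has to make that call with hand-tuned rules, while DLSS lets a trained network decide, which is presumably why the gap could keep widening as the models improve.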
 