
AMD WMMA Instruction is Direct Response to NVIDIA Tensor Cores

That's nonsense, DLSS is competitive with FSR, there will be no compatibility layer, and DLSS isn't an industry standard either, it's just a proprietary upscaling method. DLSS is exactly like G-Sync, and that's why FSR is picking up so much goodwill from the get-go compared to it; even FSR 1.0 was very popular. And because it runs on all hardware, it will probably replace DLSS as the upscaling king in the long run. Same as with FreeSync. Freedom is simply better; non-proprietary is king.

FSR looks good in most of its launch titles - since then, it's been handed off to "insert random summer intern here", with varying levels of crappy results!

DLSS is at least more consistent from one game to the next! You can't somehow pretend that just because AMD's solution is open source, the devs aren't going to rewrite it for every game - that's why we have over TWO DOZEN actively maintained windowing environments on Linux, plus several times as many package management systems as anywhere else!

So, do you really trust some random game intern to figure things out on his own every time, or do you expect "yet another mess" nearly every time they get the freedom? For every Proton managed by Valve, there are two dozen inferior copies of the same upscaler - until AMD takes the same level of responsibility they did for their initial launch titles, the mess will continue.
 
FSR looks good in most of its launch titles - since then, it's been handed off to "insert random summer intern here", with varying levels of crappy results!

DLSS is at least more consistent from one game to the next! You can't somehow pretend that just because AMD's solution is open source, the devs aren't going to rewrite it for every game - that's why we have over TWO DOZEN actively maintained windowing environments on Linux, plus several times as many package management systems as anywhere else!

So, do you really trust some random game intern to figure things out on his own every time, or do you expect "yet another mess" nearly every time they get the freedom?
So far only God of War had subpar FSR 2.0, and unless you can give me factual proof that "some interns" are handling it, I'm not believing that either. If some modder can implement FSR 2.0 in CP2077 by modifying a few DLLs, I don't think it's that hard to implement if the game already has the subroutines needed for it. So I don't agree at all.
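For what it's worth, the integration surface really is small. Below is a rough, hedged sketch of what a per-frame FSR 2.0 hookup looks like through the FidelityFX FSR2 SDK's ffxFsr2ContextCreate / ffxFsr2ContextDispatch calls; the structure field names and the flag are from memory and approximate, and all the backend/device plumbing is omitted, so treat it as an outline rather than the exact headers.

```cpp
// Hedged outline of an FSR 2.0 integration. Field names are approximate;
// the real definitions live in the FidelityFX FSR2 SDK headers.
#include <cstdint>
#include "ffx_fsr2.h"   // FidelityFX FSR2 SDK

static FfxFsr2Context g_fsr2Context;

void InitFsr2(uint32_t renderW, uint32_t renderH, uint32_t displayW, uint32_t displayH)
{
    FfxFsr2ContextDescription desc = {};
    desc.maxRenderSize = { renderW, renderH };            // jittered low-res input
    desc.displaySize   = { displayW, displayH };          // upscaled output
    desc.flags         = FFX_FSR2_ENABLE_AUTO_EXPOSURE;   // assumed flag name
    // desc.device / desc.callbacks: graphics-API backend setup omitted here.
    ffxFsr2ContextCreate(&g_fsr2Context, &desc);
}

// Called once per frame, after the scene is rendered at renderW x renderH.
void UpscaleFrame(FfxCommandList cmdList,
                  FfxResource color, FfxResource depth, FfxResource motionVectors,
                  FfxResource output, float jitterX, float jitterY,
                  uint32_t renderW, uint32_t renderH, float frameTimeMs)
{
    FfxFsr2DispatchDescription dispatch = {};
    dispatch.commandList    = cmdList;
    dispatch.color          = color;                 // jittered scene colour
    dispatch.depth          = depth;                 // scene depth
    dispatch.motionVectors  = motionVectors;         // the data a TAA game already produces
    dispatch.output         = output;                // display-resolution target
    dispatch.jitterOffset   = { jitterX, jitterY };  // must match the projection jitter
    dispatch.renderSize     = { renderW, renderH };
    dispatch.frameTimeDelta = frameTimeMs;
    ffxFsr2ContextDispatch(&g_fsr2Context, &dispatch);
}
```

The point being: a game that already produces depth, motion vectors and a jittered colour buffer for TAA or DLSS has most of this data lying around, which is why the drop-in DLL mods work at all.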
 
You guys realize this is completely irrelevant for consumers, right?
Ssshhh, no one here knows ROCm is Linux-only.

BTW, this is supported on MI100/MI200 already.
This extension is getting added to RDNA3 with most likely 3rd-gen matrix math cores, or they could be the 2nd gen in CDNA2... I am assuming 3rd gen as AMD tends to iterate rapidly.
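For anyone curious what this looks like from the software side, here is a minimal, hedged sketch of a single 16x16x16 FP16 multiply-accumulate written against the rocWMMA fragment API (which deliberately mirrors CUDA's nvcuda::wmma). The exact header path, type names and tile size are assumptions on my part; the idea is just that one mma_sync call maps down to the matrix instructions (MFMA on CDNA, the new WMMA ops on RDNA3).

```cpp
// Hedged sketch: one 16x16x16 FP16 matrix multiply-accumulate per wavefront,
// using the rocWMMA fragment API (modeled on CUDA's nvcuda::wmma).
// Header/type names are assumptions -- check the rocWMMA documentation.
#include <hip/hip_runtime.h>
#include <rocwmma/rocwmma.hpp>

using rocwmma::float16_t;
using rocwmma::float32_t;

__global__ void mma_16x16x16(const float16_t* a, const float16_t* b, float32_t* d)
{
    // Per-wavefront fragments: A and B in FP16, accumulator in FP32.
    rocwmma::fragment<rocwmma::matrix_a, 16, 16, 16, float16_t, rocwmma::row_major> fragA;
    rocwmma::fragment<rocwmma::matrix_b, 16, 16, 16, float16_t, rocwmma::col_major> fragB;
    rocwmma::fragment<rocwmma::accumulator, 16, 16, 16, float32_t> fragAcc;

    rocwmma::fill_fragment(fragAcc, 0.0f);              // C = 0
    rocwmma::load_matrix_sync(fragA, a, 16);            // leading dimension = 16
    rocwmma::load_matrix_sync(fragB, b, 16);
    rocwmma::mma_sync(fragAcc, fragA, fragB, fragAcc);  // D = A*B + C, the hardware matrix op
    rocwmma::store_matrix_sync(d, fragAcc, 16, rocwmma::mem_row_major);
}
```

One wrinkle: each tile has to be driven by a full wavefront, which is wave64 on CDNA and typically wave32 on RDNA3, so the library is hiding a real hardware difference behind that common API.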

Currently there is zero ROCm support for Windows; maybe that will change as AMD looks to make more inroads with the ROCm environment.

RT is a joke in its current state, everything looks wet....
I do enjoy Broadcast and NVENC tho.
 
Can you link where AMD ever claimed otherwise?
Not AMD, but their volunteer marketing department of vocal fans certainly have.
 
I wonder how much this addition plus the other potential RDNA3/2 differences add to the transistor budget.
If a 64RB/4096SP/128MB I.C. design (8GB, 128-bit bus) has the same QHD performance as the 6900XT, a potential Navi34 64RB/2048SP/64MB I.C. design (8GB, 128-bit bus) would logically be close to RX 6700 XT level at QHD, and a potential Navi35 32RB/1280SP/24MB I.C. design (6GB, 96-bit bus) would be at least at RTX 3050 level in FHD. So they could theoretically replace the 7nm RDNA2 designs (Navi22/23) and eventually leave only the 6nm-based Navi24 in the market.
 
Not AMD, but their volunteer marketing department of vocal fans certainly have.

OK, but those can still claim that... that still does not warrant the remark at all.
 
OK, but those can still claim that... that still does not warrant the remark at all.
Perhaps, I'd like to think I'm better than that, but those individuals I reference, some on this very forum, aren't.

And to be honest, I'm getting sick of politely making rational points and arguments about such things only to be met with that; it seems like my statements hit harder, and I get more fake internet points, when I phrase my thoughts more provocatively.
 
Perhaps, I'd like to think I'm better than that, but those individuals I reference, some on this very forum, aren't.

And to be honest, I'm getting sick of politely making rational points and arguments about such things only to be met with that; it seems like my statements hit harder, and I get more fake internet points, when I phrase my thoughts more provocatively.
Note my sig ;)

Don't let the internet get to you, is all I can say. Everything you do on it is going to be met with resistance, even simply because there are a million perspectives.
 
Don't let the internet get to you
You're not wrong, and FWIW we've disagreed sometimes in the past, but for the most part I have zero beef with you; you engage in respectful discussion even in disagreement. So yeah, I've let it get to me, but perhaps no more? It gets tiresome to always put the best foot forward when others are unwilling to do the same, and seem to get rewarded for it. May as well do as they do and say whatever I want with no filter, in my own way.
 
Ray tracing is a joke anyway; no one is missing much by not having it. It's true we haven't had many AAA releases, but even after so long we have only a few games that have done anything meaningful with it. It seems more like a must-have buzzword for the box than anything else.

The most useful thing about ray tracing is that it lowers development costs for games, which is pretty big imo.
 
The most useful thing about ray tracing is that it lowers development costs for games, which is pretty big imo.
Actually it increases costs. Why? Because not only do developers have to code for hardware that is not RT-capable, but they also have to make a separate RT version with optimized assets. The only other alternative is to make RT hardware a requirement and cut off a large portion of gamers, like with Metro Exodus EE.
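To make the "two code paths" point concrete, here's a small sketch (not from any particular engine) of the capability check where the fork starts on PC; everything downstream of that one boolean, assets included, has to exist twice.

```cpp
// Sketch of the check that splits the two render paths: query the DXR tier
// and fall back to a raster-only pipeline when the hardware has none.
#include <d3d12.h>

bool SupportsHardwareRT(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

void BuildRenderer(ID3D12Device* device)
{
    if (SupportsHardwareRT(device))
    {
        // DXR path: acceleration structures, RT pipeline state, RT-tuned assets.
    }
    else
    {
        // Raster path: baked lighting, SSR, shadow maps -- the version every GPU must run.
    }
}
```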
 
Actually it increases costs. Why? Because not only do developers have to code for hardware that is not RT-capable, but they also have to make a separate RT version with optimized assets. The only other alternative is to make RT hardware a requirement and cut off a large portion of gamers, like with Metro Exodus EE.
It's inevitable.
Probably en masse when the next cross-gen period ends.
If we assume the end of 2024 for a refreshed PS5/Xbox Series X (logically it's happening, despite what many like DF say imo), the end of 2028 for the PS6/Xbox Series X2, and a cross-gen period of 2 years, then by 2030 we will be starting to see Unreal Engine 6 games, or other next-gen engines/games, supporting only RT-capable hardware. It may seem far away, but it's less time than has passed since the PS4 launched.
If you use Unreal Engine 5 now, you're already enjoying all the cost benefits anyway, irrespective of ray tracing.
The problem is that it's heavy, so even if Epic manages to improve the engine's performance, you are probably targeting RTX 3050 and above for 1080p/30fps at near-minimum settings, and cards like the RX 6400 may need 720p/30fps at minimum settings if the Xbox Series S ends up at 720-900p/30fps, for example.
If Epic improves the engine and manages to achieve 60fps on PS5 even at 1080p (there are other PS5 1080p/60fps games even now; a year and a half ago UE5 was at the 1440p/30fps level on PS5), then since the PS5 is roughly a Sapphire RX 5700 XT Nitro+ SE in raster, for 4K 60fps at the same quality settings we will need a card around 1.5x faster than an ASRock RX 6900 XT OC Formula.
If you increase the quality settings in the PC version, even a full AD102 or a liquid-cooled Navi31 refresh may not be enough for 4K 60fps max-settings gameplay (depending on how low in quality settings Epic has to go to achieve 1080p 60fps on PS5, if it ever achieves it).
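Rough arithmetic behind that ~1.5x figure, with my own assumption for the 6900 XT vs 5700 XT gap (a typical 4K review index puts it around 2.6x), just to show where the number comes from:

```cpp
// Back-of-envelope check of the "~1.5x a 6900 XT for 4K60" estimate above.
// Assumption: RX 6900 XT ~= 2.6x an RX 5700 XT in pure raster at 4K.
#include <cstdio>

int main()
{
    const double pixels1080p = 1920.0 * 1080.0;
    const double pixels4k    = 3840.0 * 2160.0;
    const double pixelRatio  = pixels4k / pixels1080p;   // 4.0x the pixels

    const double r6900_vs_5700 = 2.6;                    // assumed relative raster performance
    // PS5 ~ 5700 XT doing 1080p60 -> 4K60 at the same settings needs ~4x a 5700 XT,
    // which works out to roughly 4.0 / 2.6 of a 6900 XT.
    const double neededVs6900 = pixelRatio / r6900_vs_5700;

    std::printf("needed vs 6900 XT: %.2fx\n", neededVs6900);   // ~1.54x
    return 0;
}
```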
 
Yeah, but not anytime soon. Remember how long ago DX12 was introduced? (2015). And we still have games releasing today that support DX11 or are even DX11-exclusive. We have DX12-compatible cards that are more than a decade old (HD 7xxx) and they are not yet obsolete.

Maybe RT adoption will be faster, but I doubt anything will be as fast as FSR/DLSS adoption speed.
 
Yeah, but not anytime soon. Remember how long ago DX12 was introduced? (2015). And we still have games releasing today that support DX11 or are even DX11-exclusive. We have DX12-compatible cards that are more than a decade old (HD 7xxx) and they are not yet obsolete.

Maybe RT adoption will be faster, but I doubt anything will be as fast as FSR/DLSS adoption speed.
The first real DX12 cards were the RX 200 series though (and the 900 series with NVIDIA), feature level 12_0. There are DX12 games now that don't start on a 7970 because it doesn't have 12_0.
 
The first real DX12 cards were RX 200 series though (and 900 series with nvidia), feature level 12_0. There are DX12 games now that don't start on 7970 because it doesn't have 12_0.
If I remember correctly, the only mandatory feature missing from Tahiti (GCN 1.0) is Tier 2 tiled resources (Hawaii (GCN 1.1) and newer?).
 
If I remember correctly, the only mandatory feature missing from Tahiti (GCN 1.0) is Tier 2 tiled resources (Hawaii (GCN 1.1) and newer?).
Could be; the newest Dirt 5(?) won't start on anything older than cards with 12_0 support.
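For what it's worth, the "won't start" behaviour usually comes down to nothing fancier than this sketch: a game that hard-requires feature level 12_0 at device creation simply fails on Tahiti-class cards, which top out at 11_1.

```cpp
// Sketch: a game requiring feature level 12_0 refuses to create a device on
// older GCN 1.0 cards (e.g. the 7970/Tahiti, which only exposes FL 11_1).
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

bool CreateDeviceRequiring12_0(IDXGIAdapter1* adapter, ComPtr<ID3D12Device>& outDevice)
{
    HRESULT hr = D3D12CreateDevice(adapter,
                                   D3D_FEATURE_LEVEL_12_0,   // the minimum this game accepts
                                   IID_PPV_ARGS(&outDevice));
    return SUCCEEDED(hr);   // on failure the game typically errors out or never launches
}
```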
 
Sell the card and buy a 16 GB Radeon RX 6800 XT or a 12 GB Radeon RX 6700 XT.

Why did you buy it in the first place?
Why bother? I am not a fanboy; if the card plays games, I am fine with it.
 
Same with RT then; AMD's implementation of RT is just weak sauce.

Tensor cores are on their 4th gen with Ada now, probably taking less than 5% of die space.



Well, if money is everything to you, then why are you spending it on useless PC stuff anyway?
Maybe the fourth-gen Tensor cores have a hand in the crazy power requirements then. If you have not realized, processing units are getting more and more power hungry, especially in the last 5 or 6 years.

In my opinion, there are pros and cons with each design. The space wasted on RT and Tensor cores could have resulted in a higher CUDA core count, which directly improves performance, whereas the likes of RT is not a feature that everyone will use in every game. I am using an RTX card, but most of the games I play don't feature RT. Tensor cores can be helpful for enabling DLSS, but as we see with FSR 2.0, they are nice to have, not a must-have, to claw back performance while still maintaining decent image quality. Again, unless an image is rendered terribly wrong, most people will not notice the difference between DLSS and FSR 2.0 in games.
 
Maybe the fourth-gen Tensor cores have a hand in the crazy power requirements then. If you have not realized, processing units are getting more and more power hungry, especially in the last 5 or 6 years.

In my opinion, there are pros and cons with each design. The space wasted on RT and Tensor cores could have resulted in a higher CUDA core count, which directly improves performance, whereas the likes of RT is not a feature that everyone will use in every game. I am using an RTX card, but most of the games I play don't feature RT. Tensor cores can be helpful for enabling DLSS, but as we see with FSR 2.0, they are nice to have, not a must-have, to claw back performance while still maintaining decent image quality. Again, unless an image is rendered terribly wrong, most people will not notice the difference between DLSS and FSR 2.0 in games.

More CUDA cores --> higher power consumption for a linear performance increase, so it's not free performance.

Whereas I have been playing at 4K DLSS Balanced on a 4K OLED for 2 years already, with almost no visual difference for 2x the FPS, without the need for 2x the CUDA cores and 2x the power consumption.
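The pixel math lines up with that if you take the commonly cited DLSS 2.x Balanced render scale of 0.58x per axis (treat that factor as an assumption on my part): only about a third of the native 4K pixels get shaded before the upscale.

```cpp
// Rough pixel-count math for 4K DLSS Balanced.
// Assumption: the Balanced preset uses a 0.58x per-axis render scale.
#include <cstdio>

int main()
{
    const double scale = 0.58;                 // Balanced preset, per axis (assumed)
    const double outW  = 3840.0, outH = 2160.0;
    const double inW   = outW * scale;         // ~2227
    const double inH   = outH * scale;         // ~1253
    const double fraction = (inW * inH) / (outW * outH);   // ~0.34

    std::printf("render: %.0f x %.0f (%.0f%% of native 4K pixels)\n",
                inW, inH, fraction * 100.0);
    return 0;
}
```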

People who love to nitpick DLSS vs Native are now saying FSR2.0 is just as good as DLSS LOL, yeah sure...

As for RT, I love playing AAA games that push the visual boundaries, and that means RT ON for the best visual immersion. Maybe in a few years you will appreciate RT ;)
 
Maybe the fourth-gen Tensor cores have a hand in the crazy power requirements then. If you have not realized, processing units are getting more and more power hungry, especially in the last 5 or 6 years.

In my opinion, there are pros and cons with each design. The space wasted on RT and Tensor cores could have resulted in a higher CUDA core count, which directly improves performance, whereas the likes of RT is not a feature that everyone will use in every game. I am using an RTX card, but most of the games I play don't feature RT. Tensor cores can be helpful for enabling DLSS, but as we see with FSR 2.0, they are nice to have, not a must-have, to claw back performance while still maintaining decent image quality. Again, unless an image is rendered terribly wrong, most people will not notice the difference between DLSS and FSR 2.0 in games.
I don't think Tensor cores reduce the CUDA core count; it's just extra die size and higher cost. The way a GPU engine is designed isn't dependent on Tensor cores; you can't simply erase Tensor and add CUDA instead, everything is fixed. Per engine you have X cores. And RT cores are needed anyway.

People who love to nitpick DLSS vs Native are now saying FSR2.0 is just as good as DLSS LOL, yeah sure...
In some proper implementations it was; I would say DLSS is overall slightly ahead though.
 
More CUDA cores --> higher power consumption for a linear performance increase, so it's not free performance.
Whereas I have been playing at 4K DLSS Balanced on a 4K OLED for 2 years already, with almost no visual difference for 2x the FPS, without the need for 2x the CUDA cores and 2x the power consumption.
An excellent point, and it's evidently quite far from wasted space, given not only that example but also the industry moving in that direction.
People who love to nitpick DLSS vs Native are now saying FSR2.0 is just as good as DLSS LOL, yeah sure...
DLSS is closer to native than FSR is to DLSS imo; what I've tested it in so far is nice at the best of times and horrendous at the worst of times. I really hope it can keep improving.

And for real, DLSS got absolutely dragged through the mud for every little nitpick possible, as well as the overarching sentiment that "native or bust, I just want a card that offers the performance natively, native cannot be exceeded" etc., only now FSR is seen by such people as "just as good" and the best thing since sliced bread. And even though you have DLSS to thank for FSR's existence (and for its easily modded inclusion into DLSS-supported games), apparently DLSS absolutely needs to immediately go away and die already, "why would devs bother". Nice one.
As for RT, I love playing AAA games that push the visual boundaries, and that means RT ON for the best visual immersion. Maybe in a few years you will appreciate RT ;)
Same, and it's obviously divisive, but I also want to play with cutting-edge visuals and rendering techniques; that's a massive part of my love of this hobby. I can of course see how that might not appeal to people, but there certainly are those of us it does appeal to.
 
And for real, DLSS got absolutely dragged through the mud for every little nitpick possible, as well as the overarching sentiment that "native or bust, I just want a card that offers the performance natively, native cannot be exceeded" etc., only now FSR is seen by such people as "just as good" and the best thing since sliced bread. And even though you have DLSS to thank for FSR's existence (and for its easily modded inclusion into DLSS-supported games), apparently DLSS absolutely needs to immediately go away and die already, "why would devs bother". Nice one.
DLSS 1.0 was dragged through the mud, but so was FSR 1.0 by many people. It was just praised for other reasons, mainly compatibility and openness; picture quality was praised more or less only at 4K Q/UQ and 1440p UQ, otherwise the IQ was pretty much criticized.
DLSS is closer to native than FSR is to DLSS imo; what I've tested it in so far is nice at the best of times and horrendous at the worst of times. I really hope it can keep improving.
Most people disagree, however; FSR 2.0 can pretty easily compete even with DLSS 2.5. Maybe try to see it without wearing green glasses. You should welcome competition and not support only one company.
 
DLSS 1.0 was dragged through the mud, but so was FSR 1.0 by many people. It was just praised for other reasons, mainly compatibility and openness; picture quality was praised more or less only at 4K Q/UQ and 1440p UQ, otherwise the IQ was pretty much criticized.
So was/is DLSS 2.x, some of which you'll find in these very forums. I do notice it's dropped off significantly after FSR started doing roughly the same thing however.
Most people disagree, however; FSR 2.0 can pretty easily compete even with DLSS 2.5. Maybe try to see it without wearing green glasses. You should welcome competition and not support only one company.
How can FSR compete with a DLSS version that doesn't exist?

I have done, and facilitated for several others, double-blind tests in God of War, and we were all able to clearly identify which solution looked better, and it was always DLSS. No slow motion, no 400% zoom, no green-tinted glasses worn.

I've said it before and I'll say it again: I am all for a totally open solution being the lasting one that makes it into games above/instead of any others; I just also happen to want it to be excellent. Right now, as a DLSS user, FSR 2.0 is a visual and often a performance downgrade. This can obviously be fixed, but it's not there yet.

I do welcome competition, and I don't support only one company, what a ridiculous assumption and statement to make.
 
So was/is DLSS 2.x, some of which you'll find in these very forums.
I can't speak for this forum since I'm new, but everywhere else I saw it mostly praised; there are always the "native freaks" who will criticize any upsampling technology, you can't help it.
How can FSR compete with a DLSS version that doesn't exist?
Yeah, my mistake, I meant 2.4.
I have done, and facilitated for several others, double-blind tests in God of War
Coincidentally the worst game for FSR 2.0, with a botched implementation. I've read all the reviews, and so far, of the official implementations, only this one was botched.
 
Coincidentally the worst game for FSR 2.0, with a botched implementation. I've read all the reviews, and so far, of the official implementations, only this one was botched.
Farming Simulator also has horrible ghosting, and the disocclusion artefacts are present in Deathloop, just not as bad, and for me and many others they are very distracting. Plus the God of War devs were keen as mustard to get FSR 2.0 in; I don't believe it's a 'botched' implementation at all, just an early one. I think FSR needs work and needs to be better tuned.
 