
Confronting NVIDIA's DLSS: AMD Confirms FidelityFX Super Resolution (FSR) to Launch in 2021

uh... 40 fps at 1080p Low settings? Nah, that's not playable; kind of a waste of money if you have a 1060 and then buy AAA games at full price. I'm sure almost all PS4/XBX owners would agree with me there :D
50 fps is perfectly playable; we're not PCMR, are we?
 
Y'all posting DLSS benchmarks at 1080p seem to forget that image quality goes down significantly with DLSS once you drop resolution to 1080p. You don't see it as much if you're at 4K.
 
Does it? The actual render resolutions are obfuscated and somewhat configurable, but the scaling factor should be the same or close regardless of output resolution. I don't usually play on a 2160p screen, so I'm not too sure about that, but 1440p with DLSS and 1080p with DLSS don't show a noticeable difference in how much image quality degrades compared to native. I suspect this has more to do with 2160p being a large enough resolution that image-quality problems in general don't stand out as much (and some games having different settings).

Granted, I don't use DLSS settings other than Quality; the image-quality hit from the others is way too severe for my taste.
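For reference, a quick sketch of how the mode-to-resolution math works. The per-axis scale factors below are the commonly reported figures for each DLSS preset, not something pulled from any game, so treat them as approximations:

```python
# Commonly reported per-axis scale factors for DLSS quality modes
# (approximate public figures, not official constants).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Internal render resolution for a given output resolution and DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for out in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    w, h = internal_resolution(*out, "Quality")
    print(f"{out[0]}x{out[1]} Quality -> renders at about {w}x{h}")
```

Since the factor is applied per axis, the internal resolution scales with the output resolution, which is the point made above: the ratio is the same whether you output at 1080p or 4K.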
 
From DF:
[attached DF comparison screenshot]
 
Heard that one before, waiting to see it materialize...

DirectX 12/Vulkan is an entirely different ballgame; Hitman 3 is already showing proof of it.
 
DLSS adds TAA in Nioh 2; the game doesn't include any anti-aliasing option, which makes DLSS the default AA mode, LOL.
The developer also forgot to add the negative mipmap bias for DLSS, which DF pointed out.
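For anyone curious what that bias is: when a game renders below output resolution and upscales, texture sampling should use a negative mip bias so textures are fetched at the detail level of the output resolution rather than the lower internal one. A sketch using the commonly cited formula (an assumption on my part, not any specific engine's code):

```python
import math

def texture_mip_bias(render_w, output_w):
    """Negative LOD bias for upscaled rendering: log2(render / output).

    Negative when rendering below output resolution, which tells the
    sampler to pick a more detailed mip level than it otherwise would.
    """
    return math.log2(render_w / output_w)

# e.g. rendering 2560 wide for a 3840-wide output (a 2/3 scale):
print(texture_mip_bias(2560, 3840))   # about -0.585
```

Skipping this (as Nioh 2's developer apparently did) means textures are mipped for the internal resolution, so they look blurrier than they should after upscaling.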

DirectX 12/Vulkan is an entirely different ballgame; Hitman 3 is already showing proof of it.

Hitman 3 is just a re-skin of Hitman 2, which already favors AMD.
[Hitman 2 FPS chart at 3840x2160]

Other DX12/Vulkan games like RDR2 and Rainbow Six Siege, and even PS4-ported games like Death Stranding and Horizon Zero Dawn, don't favor AMD hardware.
 
DLSS adds TAA in Nioh 2; the game doesn't include any anti-aliasing option, which makes DLSS the default AA mode, LOL.
The developer also forgot to add the negative mipmap bias for DLSS, which DF pointed out.

Hitman 3 is just a re-skin of Hitman 2, which already favors AMD ...

Didn't know that, thanks for pointing it out. Hmm, oh well. I'm enjoying next-gen gaming with my 6800 non-XT anyway, no waiting for me /shrug
 
Can't wait....

Wait a sec, I can already run Terraria at 4k 60fps ;)
 
Now if only they'd launch more GPUs before launching this...
 

You mean they already have like 4 models out?
I meant it as in more quantity, sarcastically. Thought this was obvious.
 
Yawn. Yet another tech they copy. When will AMD ever become a leader in development? Never? They even copy Intel's motherboard model numbering system by just tacking on a 100. If AMD never had the massive investment by the Middle Eastern so-called princes in 2008 and '09, they would have been gone. Now that they have money, why are they taking so long to develop unique technologies rather than copying the same or similar and just renaming everything? Come on AMD, develop something exciting that no one else has; you have the backing with billions of filthy oil money, get on with it.
 
Well, those were certainly words... Vulkan exists, Sapphire Boost exists, Zen 3 exists. I can't be arsed to go on with AMD tech innovations. The real difference here is AMD tries to make their tech as open source and agnostic as possible.
Your obvious bias could have stopped after your first sentence...
 
Yawn. Yet another tech they copy. When will AMD ever become a leader in development? ...
This is literally the second post I've seen from you with the exact same opening. Will you contribute something at some point?

Pretty sure AMD was the one innovating from 2017 onward while Intel was sleeping on their throne of money.
 
Horizon Zero Dawn don't favor AMD hardware
Both those titles run quite well; DS runs fantastic and had FidelityFX CAS long before DLSS was added. HZD definitely has performance issues, but I don't think they're limited to AMD cards; it's more the older engine version and poor optimization overall.
 
Anyone else having a bit of déjà vu?

Nvidia makes something new but makes it proprietary.
AMD makes something similar but makes it open.
The open version becomes the standard.
Lol, shots fired!!! Hopefully this happens, not because I'm an AMD fanboy but because I just think the open standard is better for the user base. And I have a 3070 in my system right now.
 
Yawn. Yet another tech they copy. When will AMD ever become a leader in development? ...
You might be too young to remember when AMD created the now industry standard 64-bit extension of the x86 instruction set, or when ATi (now AMD Radeon Technologies Group) beat nvidia's GeForce FX series into the dust with their Radeon 9000 series graphics cards, but I'm sure you've heard about Ryzen.

Truth is, there's always more than one company making the same kind of product (Coca Cola - Pepsi, McDonalds - Burger King, Ford - Chevrolet), but that doesn't make either of them a fake copy.
 
Being open in this case is a red herring.

The actual APIs can be as open as possible, but there's still learning involved in the training phase. In a completely open solution, who performs the training? Who acts as a repository for the training results?

Sure, chances are we'll end up with an open API after all (even DXR was open from the beginning, even if Nvidia tacked on RTX). I'm just saying, the meat of these algorithms will remain in AMD's and Nvidia's hands.
AMD has stated their tech won't need training; time will tell, but we still have no idea what it is.
 
If it doesn't involve training, it can only be some generic scaling algorithm. Possibly a bunch of them, each doing better than the rest in a given area.
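To illustrate what "generic scaling algorithm" means in practice, here is the simplest member of that family, a plain bilinear upscale in pure Python. Real spatial upscalers use fancier, edge-aware kernels (Lanczos and the like), but the structure is the same: each output pixel is a weighted sum of nearby input pixels, with no trained network involved:

```python
# Minimal bilinear upscaler for a grayscale image stored as a 2D list.
def bilinear_upscale(img, out_h, out_w):
    in_h, in_w = len(img), len(img[0])
    out = [[0.0] * out_w for _ in range(out_h)]
    for y in range(out_h):
        for x in range(out_w):
            # Map each output coordinate back into input space.
            sy = y * (in_h - 1) / (out_h - 1) if out_h > 1 else 0
            sx = x * (in_w - 1) / (out_w - 1) if out_w > 1 else 0
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, in_h - 1), min(x0 + 1, in_w - 1)
            fy, fx = sy - y0, sx - x0
            # Blend the four surrounding input pixels.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            out[y][x] = top * (1 - fy) + bot * fy
    return out

small = [[0, 100], [100, 200]]
big = bilinear_upscale(small, 3, 3)
print(big[1][1])  # centre pixel is the average of the four corners: 100.0
```

The appeal of this family is exactly what was said above: no training, no vendor-held model, just a fixed kernel anyone can implement.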
 
FidelityFX works nicely with Cyberpunk. It does boost performance, considering how demanding that game is. I wish AMD would hurry with the FSR release, but I know it's better to wait a bit longer to get it done right. I'm really curious how this one will work.

Hitman 3 is just a re-skin of Hitman 2, which already favors AMD
I kinda get the feeling that, with you, any game that favors AMD should be disregarded immediately as a GPU performance indicator. I can't see you saying the same about NV.
Since we have 2 major companies making GPUs, wouldn't it be healthy to have games "favored" by both in the benchmark mix?
 
Was really good in Death Stranding as well. I'm using 95% in CP2077.
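For the curious, the arithmetic behind a 95% slider, assuming it scales each axis (which is how most games' resolution-scale options behave; I haven't verified CP2077's exact implementation):

```python
# A 95% per-axis scale renders ~90% of the pixels, since pixel count
# shrinks with the square of the axis scale.
def render_stats(out_w, out_h, scale_pct):
    s = scale_pct / 100
    w, h = round(out_w * s), round(out_h * s)
    saved = 1 - (w * h) / (out_w * out_h)  # fraction of pixels not shaded
    return w, h, saved

w, h, saved = render_stats(3840, 2160, 95)
print(f"{w}x{h}, ~{saved:.0%} fewer pixels shaded")
```

That roughly 10% shading saving, plus sharpening to recover the detail, is why a 95% scale is hard to notice in motion.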
 
Never played Death Stranding, but I do hear it works very well. In CP I tried different configs with my GPU, and to be honest I was surprised how well this feature works.
I'm just hoping FidelityFX Super Resolution will bring even more to the table in terms of performance. Also, fingers crossed, it won't impact quality as much.
 
Yeah, I mean I don't know if 95% was the default, but I was well into the game when I noticed it, so I've just left it there; I was already used to the IQ. Death Stranding as a whole was a really well-optimized game, but I always turn on the "special sauce" anyway.
 