
AMD Set to Announce FSR 2.0 Featuring Temporal Upscaling on March 17th

You are mistaken about some things I wrote.

I am aware of how FSR and scaling are supposed to work, and I spent quite some time, especially with the 1050 Ti, assessing the performance and visual tradeoffs. I'll assume you've done the same; if you haven't, you should try it and see for yourself.

I agree that using lower quality modes past Ultra Quality should result in steadily improving FPS, but these improvements are small compared to the initial big FPS boost from native to UQ.

I wrote that Quality gives almost no FPS improvement over Ultra Quality. You seem to have assumed I meant it gave a smaller FPS improvement over native, which is not the case. As the image quality in Quality mode is noticeably lower than in Ultra Quality mode, I feel that UQ is the sweet spot for FSR 1.0, at least for the GPUs, games, and display resolutions I tested.
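For reference, FSR 1.0's published per-axis scale factors are Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, and Performance 2.0x. A quick sketch (the function name is my own) of the internal render resolutions each mode implies at 1080p, which shows why the biggest resolution drop is the first step from native to UQ:

```python
# FSR 1.0's published per-axis scale factors, used here purely as an
# illustration of each mode's internal render resolution.
FSR1_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_resolution(out_w, out_h, mode):
    """Internal resolution FSR 1.0 upscales from, for a given output size."""
    s = FSR1_SCALE[mode]
    return round(out_w / s), round(out_h / s)

for mode in FSR1_SCALE:
    w, h = render_resolution(1920, 1080, mode)
    print(f"{mode:14s} -> {w}x{h} ({100 / FSR1_SCALE[mode]:.0f}% per axis)")
```

At 1080p that works out to roughly 1477x831 (UQ), 1280x720 (Quality), 1129x635 (Balanced), and 960x540 (Performance).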



If you're playing CP2077 on a 1050 Ti, Ultra Quality makes it reasonable to play at 1080p, as the FPS improvement outweighs the image fidelity loss. IMO of course; others may not like that tradeoff. However, on the same computer, I felt the TAA implementation in HZD was so distractingly bad in combination with FSR that I mildly preferred the generally soft but not distracting presentation of 80% raw scaling over FSR UQ or Q at 1080p. Honestly, I didn't like either, and even though HZD is a great game, I'd just play something else until I got a better card.
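As a rough back-of-envelope comparison (my own arithmetic, not a benchmark): 80% raw scaling actually shades slightly more pixels than FSR Ultra Quality does, so the choice between them really is about how each reconstructs the image, not raw render cost:

```python
# Illustrative pixel-count comparison: 80% resolution scaling vs
# FSR 1.0 Ultra Quality (1.3x per-axis factor), both outputting 1080p.
out_w, out_h = 1920, 1080

raw_w, raw_h = int(out_w * 0.8), int(out_h * 0.8)      # 1536 x 864
fsr_w, fsr_h = round(out_w / 1.3), round(out_h / 1.3)  # 1477 x 831

print(f"80% raw scale renders {raw_w * raw_h:,} pixels")   # 1,327,104
print(f"FSR UQ renders        {fsr_w * fsr_h:,} pixels")   # 1,227,387
```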


OK, now I got ya! I did confuse what you meant; given the way it was written, I thought your comments were all in comparison to native, so I thought you were confused about the settings. We're on the same page now :)

Funny you say that, I have taken the same approach with certain cards and games. When I got AC Odyssey it didn't play too hot on an RX 580 unless at like 1080p low settings, which was not very good looking on a 4K TV. It also didn't help that I was on a roughly 10-year-old X58 system at the time, so a CPU bottleneck was a real problem. So I waited until I got an RX 5700 XT, which went into a whole new Z390 system, and could play at something like 1440p high settings instead. Sometimes it's worth the wait to enjoy playing a game with better hardware. But FSR is here to help gamers in that exact situation, so I think it's a great piece of tech that hopefully can only get better from here.
 

Space Lynx

Astronaut
Maybe we should all stop bitching and just be thankful these companies are still trying to help gamers during this time of shortage? I mean, nothing was stopping Nvidia or AMD from just saying fuck all gamers, let's go all-in on crypto short term and make that money, the gamers will come back when we tell them it's time to come back.

Nothing was stopping them; they are literally a duopoly.
 

wolf

Performance Enthusiast
Nothing is better than native
This statement isn't true at all, why do people still say this?
isn't 4x DSR / DL-DSR better than native?
Yes.
It's true, you can get better than native now.
Absolutely we can.

Just saying "native" as an all-encompassing, never-able-to-be-surpassed level of quality is misleading at best. What resolution? What pixel density? What display technology? What AA method? What game? In motion? The list goes on.

The sentiment needs to die already. There are multiple ways to improve IQ over rendering exactly your display's native number of pixels in the 'traditional' way (and more often than not, with an AA pass).
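For what it's worth, 4x DSR is conceptually just ordered-grid supersampling: render at 2x per axis, then filter down to the output resolution. A toy sketch (not NVIDIA's actual filter, which uses a tunable Gaussian, and DLDSR is AI-based) of why that can beat native: averaging each 2x2 block turns a hard edge that native sampling would alias into graded coverage values.

```python
# Minimal ordered-grid supersampling sketch: box-filter a 2H x 2W grid of
# luminance values down to H x W by averaging each 2x2 block.
def downsample_2x(image):
    h, w = len(image) // 2, len(image[0]) // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = (image[2*y][2*x] + image[2*y][2*x+1] +
                         image[2*y+1][2*x] + image[2*y+1][2*x+1]) / 4.0
    return out

# A hard diagonal edge rendered at 2x: native sampling would give only
# 0/1 stair-steps, while the downsample yields intermediate coverage.
hi_res = [[1.0 if x > y else 0.0 for x in range(4)] for y in range(4)]
print(downsample_2x(hi_res))  # -> [[0.25, 1.0], [0.0, 0.25]]
```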
 
On "nothing is better than native": I think they are referring to how DLSS can resolve some details better than native (though it ruins the image in many others).
A lot of people just seem blind when it comes to the artifacts that DLSS introduces. I've heard many people say that Cyberpunk 2077 looks "better than native" with DLSS enabled, yet every time I enable it, even on Quality mode at 1440p, all I see is pixels crawling all over the place on distant fine detail and odd flickering on certain textures. Can't say the ghosting ever really bothered me though, even before they improved it.
 

wolf

Performance Enthusiast
A lot of people just seem blind when it comes to the artifacts that DLSS introduces. I've heard many people say that Cyberpunk 2077 looks "better than native" with DLSS enabled

Often in games, native is not artefact-free (or, to frame it another way, it's not perfect): there can be shimmering, ghosting, crawling, popping textures, LOD changes, etc. So DLSS may fix some artefacts and introduce others, hence why some would claim better than native: the artefact(s) their eyes are most drawn to are solved, perhaps at the expense of others they might not notice. That's on top of the typical aspects we know it does well, like fine detail and wire fences; those aren't really debated, and fine detail has that ability to grab the eye and impress.

Another factor might be something like using a slow VA panel: people might not see any ghosting because their panel produces it anyway, so they're used to it all the time, and DLSS doesn't seem to make it any worse.

I'm not blind to how DLSS can negatively affect an image, but it can certainly be said that aspects of the image are positively improved. One of my top bugbears is shimmering, and the AA in DLSS often manages to greatly reduce or even completely eliminate it. So for me, weighing a native presentation with distracting shimmer against a DLSS implementation that solves the shimmer but perhaps has some pixel crawl or minor ghosting, I'll almost certainly find the DLSS image preferable, even before counting the extra FPS.
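A toy sketch (not any vendor's actual algorithm) of the core idea behind the temporal accumulation in DLSS, and in the FSR 2.0 this thread is about: blend each new jittered frame into a running history with an exponential moving average, so a sub-pixel feature that flickers on and off at native converges toward its true coverage instead of shimmering.

```python
# Exponential-moving-average temporal accumulation, the basic mechanism
# temporal AA/upscalers use (real ones add motion reprojection, history
# rejection, etc., which this deliberately omits).
def temporal_accumulate(samples, alpha=0.1):
    """Blend each per-frame sample into a running history; return the result."""
    history = samples[0]
    for s in samples[1:]:
        history = (1 - alpha) * history + alpha * s
    return history

# A thin feature alternating 0/1 each frame (classic shimmer at native)
# settles near its true ~0.5 coverage after enough frames.
flicker = [0.0, 1.0] * 20
print(round(temporal_accumulate(flicker), 3))
```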
 
Maybe we should all stop bitching and just be thankful these companies are still trying to help gamers during this time of shortage? I mean, nothing was stopping Nvidia or AMD from just saying fuck all gamers, let's go all-in on crypto short term and make that money, the gamers will come back when we tell them it's time to come back.

Nothing was stopping them; they are literally a duopoly.
Maybe you should just speak for yourself because honestly... what?! This is a truly dumb statement in every possible way.

This statement isn't true at all, why do people still say this?

Yes.

Absolutely we can.

Just saying "native" as an all-encompassing, never-able-to-be-surpassed level of quality is misleading at best. What resolution? What pixel density? What display technology? What AA method? What game? In motion? The list goes on.

The sentiment needs to die already. There are multiple ways to improve IQ over rendering exactly your display's native number of pixels in the 'traditional' way (and more often than not, with an AA pass).

It's true, especially the bit about motion clarity, but there is also perception and getting comfortable with certain methods of rendering, in much the same way that applies to buying a new monitor and getting used to a different number of pixels at a different pitch. But I will concede that DLSS and FSR can be brought to a setup where there are only advantages. It's certainly a step forward.

And then there is the implementation. I remember The Elder Scrolls Online: because the assets themselves are so low in detail and texture quality, the whole thing kind of falls apart and becomes an even more blurry mess. That's probably a game/engine/level of detail where the native sharpness adds fidelity, even if it also leaves some aliased edges. Sharpening passes in that game (I used to screw around with ReShade and the like well before DLSS) were horrible too, even back then; the best effort was adding some mild juice to the bland color palette.
 

Space Lynx

Astronaut
Maybe you should just speak for yourself because honestly... what?! This is a truly dumb statement in every possible way.



It's true, especially the bit about motion clarity, but there is also perception and getting comfortable with certain methods of rendering, in much the same way that applies to buying a new monitor and getting used to a different number of pixels at a different pitch. But I will concede that DLSS and FSR can be brought to a setup where there are only advantages. It's certainly a step forward.

And then there is the implementation. I remember The Elder Scrolls Online: because the assets themselves are so low in detail and texture quality, the whole thing kind of falls apart and becomes an even more blurry mess. That's probably a game/engine/level of detail where the native sharpness adds fidelity, even if it also leaves some aliased edges. Sharpening passes in that game (I used to screw around with ReShade and the like well before DLSS) were horrible too, even back then; the best effort was adding some mild juice to the bland color palette.

yawn
 

Deleted member 185088

Guest
Of course it can be better than native. When you change the AA implementation, like DLSS does, you can end up with much better IQ even if you're rendering from a lower resolution. Some games look awful at native.
It depends; globally it's worse, but for some fine details DLSS can improve things due to its nature.
I wouldn't call that better than native. :laugh:
Of course not, but some reviewers mislead people and say it's better. It does improve some fine details, though.
isn't 4x DSR / DL-DSR better than native? @Xex360

View attachment 239962
You got me there!
 
Joined
Mar 21, 2016
Messages
2,198 (0.74/day)
It's worth pointing out that ATI had temporal sparse-grid AA as far back as the X800 series, so we knew this was coming eventually.
 