
AMD's Elusive FidelityFX Super Resolution Coming This June?

Joined
Jun 3, 2010
Messages
1,914 (0.48/day)
I like the mention of MadVR. A while back I was wondering whether the staircase effect could be mitigated using its filters. In older games, where shader injection made hardware antialiasing a viable option, 2xSSAA gave me the effect I was looking for: clear pixels, with blurring only where the staircase effect would appear. If I could, I would place an analytical filter after the blend stage of 2xRGSSAA in post-processing. That would be my best bet, although a more ambitious goal of mine would be to eliminate the quad helper-pixel overshading "staircase effect" right from the start. I'm no computer graphics expert, but I don't think vertex work is such a limited resource that all later pixel stages should be compromised for it; they all run on unified shaders now anyway. Just nip the problem in the bud: if there are no quad overshading artifacts, there's no need for post-processing analytical filters to patch the problem with a high-level stopgap. I don't know... maybe consider it. Perhaps the numbers will prove it's less overhead to throw quad shading onto the junk pile entirely.
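The 2xSSAA effect described there is easy to see in a toy resolve step: render at twice the display resolution, then box-filter down. A minimal grayscale sketch (the function name and the plain box filter are my illustration, not MadVR's or any driver's actual kernel):

```python
import numpy as np

def ssaa_resolve(hi_res: np.ndarray, factor: int = 2) -> np.ndarray:
    """Box-filter a supersampled image down to display resolution."""
    h, w = hi_res.shape[0] // factor, hi_res.shape[1] // factor
    blocks = hi_res[:h * factor, :w * factor].reshape(h, factor, w, factor)
    # Average each factor x factor block: edges get blended,
    # flat regions stay perfectly crisp.
    return blocks.mean(axis=(1, 3))

# A hard vertical edge rendered at 2x the display resolution...
hi = np.zeros((4, 4))
hi[:, 1:] = 1.0
lo = ssaa_resolve(hi)
# ...resolves to a 2x2 image where only the column containing the
# edge is blended to 0.5; the rest stays pure black or white.
```

Only the block straddling the edge gets averaged, which matches the "clear pixels, blur only at the staircase" observation.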
 
Joined
Jan 8, 2017
Messages
6,753 (4.18/day)
System Name Good enough
Processor AMD Ryzen R7 1700X - 4.0 Ghz / 1.350V
Motherboard ASRock B450M Pro4
Cooling Deepcool Gammaxx L240 V2
Memory 16GB - Corsair Vengeance LPX - 3333 Mhz CL16
Video Card(s) OEM Dell GTX 1080 with Kraken G12 + Water 3.0 Performer C
Storage 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) 4K Samsung TV
Case Deepcool Matrexx 70
Power Supply GPS-750C
I prefer to try and encourage people to have a more balanced and fair outlook on things, rather than just dismissing something as garbage before it's even released.
No you don't; you encourage them to shut up because supposedly they're not engineers and therefore can't have a valid opinion on anything. As if you need intricate knowledge of how something is constructed to judge the quality of the end product, but anyway.

You said you're a software engineer, but that doesn't mean anything in particular. Have you worked with ML frameworks and written software similar to DLSS?
 

clopezi

New Member
Joined
Sep 27, 2020
Messages
13 (0.05/day)
I maintain that DLSS marketing is fake: DLSS 4K isn't 4K, it's 1080p (depending on the setting) upscaled. Comparing native 4K against 4K DLSS is just nonsense; the comparison should instead be 1080p versus 1080p with DLSS upscaling, and then we can talk about the image quality improvements (just as any form of AA improves image quality).

Apart from that, I think something like DLSS should stick to what it's meant for: making ray tracing cheaper. In my opinion, DLSS or an equivalent should reduce and then upscale the resolution of the ray-traced effects only. Render those ray-traced reflections, for example, at quarter resolution so they're easier on the hardware, then use DLSS to upscale them so they don't look like crap, and do the same for shadows and so on.
Sorry but you are wrong.

Please, check this:

Screenshot_3.png


No sense? Better definition and 90 fps instead of 45 fps... and it keeps getting better over time. Yesterday Metro Exodus was re-released with DLSS 2.0; on low-end cards like the RTX 2060, 49 fps instead of 11.3 fps... c'mon, DLSS gets a lot more juice out of these GPUs.

photo_2021-05-06_23-05-35.jpg


The tech is awesome.
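For context on what's being compared in those screenshots: DLSS renders internally at a fraction of the output resolution and upscales. The per-axis scale factors below are the commonly reported defaults (treat them as approximate; exact values can vary by title and DLSS version):

```python
# Approximate per-axis DLSS input scale factors (as commonly reported;
# exact values can vary by title and DLSS version).
SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Internal render resolution for a given output size and mode."""
    s = SCALE[mode]
    return (round(out_w * s), round(out_h * s))

for mode in SCALE:
    print(mode, render_resolution(3840, 2160, mode))
# Performance mode at a 4K output renders internally at 1920x1080,
# which is why the frame rate roughly doubles in GPU-bound scenes.
```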
 
Joined
Mar 28, 2020
Messages
805 (1.82/day)
Whoa. So it is coming this June? Not bad. I'm really curious how FSR will stand up against DLSS 2.0 and 2.1, if it actually is any good. But the idea of FSR working on both AMD and NV cards, and being easier to implement in any game, is a damn killer if the feature turns out to be good, or at least close to what DLSS 2.0 offers. A damn killer, I tell ya'll.
From the way AMD is selling FSR, it is certainly a more attractive proposition than Nvidia's bespoke DLSS. Having said that, I am not expecting FSR to beat DLSS, but as long as it gets 75% of the way there and is easier to implement, I think DLSS may have a hard time going forward.
 
Joined
Sep 15, 2011
Messages
5,584 (1.57/day)
Processor Intel Core i7 3770k @ 4.3GHz
Motherboard Asus P8Z77-V LK
Memory 16GB(2x8) DDR3@2133MHz 1.5v Patriot
Video Card(s) ZOTAC GAMING GeForce RTX 3080 Trinity
Storage 59.63GB Samsung SSD 830 + 465.76 GB Samsung SSD 840 EVO + 2TB Hitachi + 300GB Velociraptor HDD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Anker
Software Win 10 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
As an owner of an RTX 3080, let me make this clear, beyond any doubt: DLSS 2 is good, but not in a million years the same quality as the native render. And yes, I've tested it on all the supported games, including Control, Death Stranding and Cyberpunk... Nah, forget about the screenshots. The low-res feeling is instant when switching to DLSS, even disturbing at times.

P.S.
As for the "marketing" clips that shows DLSS 2 4K image looking better than the native 4K one, I only have 1 word for you: SHARPNESS.
They just add extra sharpness to fake the crispness of the texture, but if you look closer to the images you can clear distinguish the blurred jaggies due to the low res upscaled.
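The sharpening being described is essentially an unsharp mask: subtract a blurred copy of the image to exaggerate local contrast. A minimal NumPy sketch of that general technique (my illustration; NVIDIA's actual filter is not public):

```python
import numpy as np

def box_blur(img: np.ndarray) -> np.ndarray:
    """5-tap cross-shaped box blur with edge replication."""
    p = np.pad(img, 1, mode="edge")
    return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2]
            + p[1:-1, 2:] + p[1:-1, 1:-1]) / 5.0

def unsharp_mask(img: np.ndarray, amount: float = 1.0) -> np.ndarray:
    # Boost the difference between the image and its blurred copy:
    # edges overshoot and look "crisper", but no new detail is created.
    return img + amount * (img - box_blur(img))

# A soft step edge in a grayscale test image...
edge = np.tile(np.array([0.2, 0.2, 0.8, 0.8]), (4, 1))
sharp = unsharp_mask(edge)
# ...now undershoots below 0.2 and overshoots above 0.8 around the step,
# which the eye reads as extra sharpness.
```

The contrast boost is purely local; it cannot recover detail that was never rendered, which is the poster's point.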
 
Joined
Sep 25, 2019
Messages
66 (0.11/day)
Processor AMD Threadripper 2950X
Motherboard ASUS ROG Zenith Extreme
Cooling Custom hard-line water HeatKiller Pro CPU+VRM, Dual Radiator, EK Res'/fittings
Memory 64GB Corsair Dominator RGB
Video Card(s) Liquid Devil 6900XT
Storage Samsung 512GB Pro M.2. Intel 1.2TB PCI NVMe, 3 X 512GB SSD
Display(s) ASUS ROG 34" Curved UW 3440x1440 100Hz
Case Thermaltake View 71T (10 Fans)
Power Supply Corsair HX1000i
Mouse Logitech G502
Keyboard ASUS ROG
Software Win 10 PRO x64
Benchmark Scores Cinebench R20: 7984
Sorry but you are wrong.

Please, check this:
Are you sure those images are correct? I don't recall my 4K image looking as poor as this one appears to.
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
5,973 (1.16/day)
System Name MightyX
Processor Ryzen 9 5900X 5ghz
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Noctua NH-L12
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 OC/UV + duct
Storage Samsung 970 Evo m.2 NVME
Display(s) 21:9 3440x1440 144hz 1500R VA
Case Coolermaster NR200P
Power Supply Corsair SF600 Gold
Mouse Zowie EC1-A
Keyboard Razer Blackwidow X Chroma
Software case populated with Artic P12's
As an owner of an RTX 3080 , let me put it clear as without any doubt. DLSS 2 is good ,but not in a million year within the same quality as the native render.
As valid as that view is, it's in fairly stark contrast to how others see it in the titles that show it best. I'd wager that if you blind-tested a large sample of gamers and didn't prime them on what to look for, the results would surprise you. I can't deny your own experience and what your eyes see, but DLSS 2.0 quality mode can be exceptionally close to native all things considered, with strengths and weaknesses at various points in the image, most of which you have to nitpick to tell apart.

Plus, if that's how you feel about DLSS... I don't think you'll be impressed with FSR image quality. I hope I'm wrong.

I maintain that DLSS marketing is fake, DLSS 4k isnt 4k, its 1080p (depending on the setting) upscaled, so comparing 4k vs 4k DLSS is just nonsense
All that really matters is the output image on your screen, marketing is always going to be marketing.

To me the output is all that matters; if it's comparable to native, who cares one bit what the input resolution is?

I'm not bothered at all how the magic pixels make it to my screen, the image I get at the end is the important part. Rendering is all ridiculous computer magic to most anyway, judge the final image as the final image.
 

clopezi

New Member
Joined
Sep 27, 2020
Messages
13 (0.05/day)
Are you sure those images are correct? I don't recall my 4K image looking as poor as this one appears to.
Ask Eurogamer and DF team!

But the point is that the tech is fantastic, and it's great news that AMD users will also get to enjoy a similar solution, I hope.
 
Joined
Jul 9, 2015
Messages
2,857 (1.32/day)
System Name My all round PC
Processor i5 750
Motherboard ASUS P7P55D-E
Memory 8GB
Video Card(s) Sapphire 380 OC... sold, waiting for Navi
Storage 256GB Samsung SSD + 2Tb + 1.5Tb
Display(s) Samsung 40" A650 TV
Case Thermaltake Chaser mk-I Tower
Power Supply 425w Enermax MODU 82+
Software Windows 10
For starters, it doesn't require training from a generative adversarial network (GAN), and doesn't rely on ground truth data.
For starters, neither does NV's TAA derivative, also known as DLSS 2.0.

Please, check this:
All TAA derivatives, DLSS 2 among them, exhibit the following "features":
1) Improved thin lines (long grass, hair, eyebrows, etc.)
2) Less noticeable flaws with blurry textures, which is why people keep reposting that face from that weird game that looks like it's from 2003: it barely has any texture detail, but it has hair
3) Wiping out fine detail
4) Added blur when things move fast (the entire screen can be blurred in Death Stranding just by moving the mouse quickly; see the Ars Technica review)
5) Particularly bad results with small, quickly moving objects
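Points 3, 4 and 5 all trace back to the same mechanism: temporal techniques blend each new frame into an accumulated history buffer, so anything that moved since the previous frame leaves a fading trail. A toy 1-D sketch of the exponential history blend (a generic TAA illustration, not DLSS internals):

```python
import numpy as np

def taa_step(history: np.ndarray, current: np.ndarray,
             alpha: float = 0.1) -> np.ndarray:
    """Exponential blend: keep 90% of history, take 10% of the new frame."""
    return alpha * current + (1 - alpha) * history

# A bright dot moving one pixel per frame across a 1-D "screen".
frames = [np.zeros(8) for _ in range(4)]
for i, f in enumerate(frames):
    f[i] = 1.0

history = frames[0].copy()
for f in frames[1:]:
    history = taa_step(history, f)

# The dot's current position (index 3) holds only 10% brightness while
# its old positions still linger: a ghost trail, i.e. motion blur on
# fast-moving objects.
print(np.round(history, 3))
```

Real implementations add motion-vector reprojection and history rejection to fight this, but the trade-off between temporal stability and smearing is inherent to the accumulation.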
 
Joined
Jul 3, 2019
Messages
174 (0.24/day)
Location
Bulgaria
Processor 6700K
Motherboard M8G
Cooling D15S
Memory 16GB 3k15
Video Card(s) 2070S
Storage 850 Pro
Display(s) U2410
Case Core X2
Audio Device(s) ALC1150
Power Supply Seasonic
Mouse Razer
Keyboard Logitech
Software 20H2
There are still people that believe and push Nvidia's marketing propaganda that DLSS2 is better than native? HILARIOUS!
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
5,973 (1.16/day)
System Name MightyX
Processor Ryzen 9 5900X 5ghz
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Noctua NH-L12
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 OC/UV + duct
Storage Samsung 970 Evo m.2 NVME
Display(s) 21:9 3440x1440 144hz 1500R VA
Case Coolermaster NR200P
Power Supply Corsair SF600 Gold
Mouse Zowie EC1-A
Keyboard Razer Blackwidow X Chroma
Software case populated with Artic P12's
There are still people that believe and push Nvidia's marketing propaganda that DLSS2 is better than native
I believe what I see with my own eyes, yes. In well-executed titles it does look better to my eyes, and it performs better. FSR can only hope to achieve the same.
 
Joined
Jun 23, 2011
Messages
357 (0.10/day)
System Name potato
Processor Ryzen 9 3950X
Motherboard MSI MAG B550 Tomahawk
Cooling Custom WC Loop
Memory 2x16GB G.Skill Trident Z Neo 3600
Video Card(s) Radeon VII
Storage Team Cardea II 512GB + HDD 2x WD VRaptor 500GB & 2x WD Blue 4TB
Display(s) XIAOMI Curved 34" 144Hz UWQHD
Case be quiet dark base pro 900
Audio Device(s) Logitech G733
Power Supply Corsair AX860i
Mouse Logitech G Pro
Keyboard Corsair K65
Software win 10 amd64
I believe what I see with my own eyes, yes. In well-executed titles it does look better to my eyes, and it performs better. FSR can only hope to achieve the same.
Well, I'm certain it doesn't look better than native to me, though I agree it performs way better.
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
5,973 (1.16/day)
System Name MightyX
Processor Ryzen 9 5900X 5ghz
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Noctua NH-L12
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 OC/UV + duct
Storage Samsung 970 Evo m.2 NVME
Display(s) 21:9 3440x1440 144hz 1500R VA
Case Coolermaster NR200P
Power Supply Corsair SF600 Gold
Mouse Zowie EC1-A
Keyboard Razer Blackwidow X Chroma
Software case populated with Artic P12's
Well, I'm certain it doesn't look better than native to me, though I agree it performs way better.
Well, a final output image is nuanced, especially in motion; there is a lot to unpack there. For instance, shimmering on straight(er) edges is a nitpick of mine where DLSS helps immensely, along with intricate details; perhaps other parts of the image matter more to you? I'd encourage you to join the conversation here if you have extended thoughts to share or anything you want to discuss with other people who use DLSS 2.0.
 
Joined
Jan 14, 2019
Messages
801 (0.91/day)
Location
United Kingdom
System Name Nebulon-B Mk. 2
Processor Intel Core i7-11700
Motherboard ASUS TUF Gaming B560M-Plus (WiFi)
Cooling be quiet! Shadow Rock LP
Memory 2x 16 GB Corsair Vengeance LPX 3200 MHz CL16
Video Card(s) ASUS GeForce GTX 1650 4 GB LP OC
Storage 1 TB Corsair MP400, 512 GB ADATA SU900
Display(s) Samsung C24F396
Case AeroCool CS-101, 2x 8 cm Akasa slim fans
Audio Device(s) Genius SP-HF160 speakers, AKG Y50 headphones
Power Supply SilverStone SX300
Mouse Cherry MW 8
Keyboard MagicForce 68
Software Windows 10
Compatible with Nvidia GPUs. That's the stuff! No more proprietary technologies, please!
 
Joined
Feb 11, 2009
Messages
3,403 (0.76/day)
System Name Cyberline
Processor Intel Core i7 2600k
Motherboard Asus P8P67 LE Rev 3.0
Cooling Tuniq Tower 120
Memory Corsair (4x2) 8gb 1600mhz
Video Card(s) AMD RX480
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb
Display(s) Philips 32inch LPF5605H (television)
Case antec 600
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Sorry but you are wrong.

Please, check this:

View attachment 199551

No sense? Better definition and 90 fps instead of 45 fps... and it keeps getting better over time. Yesterday Metro Exodus was re-released with DLSS 2.0; on low-end cards like the RTX 2060, 49 fps instead of 11.3 fps... c'mon, DLSS gets a lot more juice out of these GPUs.

View attachment 199552

The tech is awesome.

it gets a "lot more juice" because you are not running it at 4k, that is what makes its so fake.

they know that saying its "1080p" upscaled sounds less good, instead they pretend its magic and their cards can do "4k" with Ray Tracing, but in reality they are doing less then 4k to be able to that.

my problem is the marketing, and I can only repeat myself, if they compared 1080p performance to 1080p performance, ya know, apples to apples, then say "yeah ok the performacne is the same between the two products BUT ours can do this but with our version of anti aliasing so it looks better" then that is fine.

But instead they knowingly say "our card with 4k DLSS does better then the competition at 4k".....well yeah because you are not running it at 4k now are ya?
 
Joined
Sep 28, 2012
Messages
785 (0.25/day)
System Name Potato PC
Processor AMD Ryzen 5 3600
Motherboard ASRock B550M Steel Legend
Cooling ID Cooling SE 224XT Basic
Memory 32GB Team Dark Alpha DDR4 3600Mhz
Video Card(s) MSI RX 5700XT Mech OC
Storage Kingston A2000 1TB + 8 TB Toshiba X300
Display(s) Mi Gaming Curved 3440x1440 144Hz
Case Cougar MG120-G
Audio Device(s) Plantronic RIG 400
Power Supply Seasonic X650 Gold
Mouse Logitech G903
Keyboard Logitech G613
Benchmark Scores Who need bench when everything already fast?
Some people just flat out hate any blur whatsoever, and I get that, but per-object motion blur intends to mimic the way we actually see objects in motion (try waving your hand back and forth in front of your eyes: do you see a blurred hand, or frame-by-frame snaps of a sharp hand?). I find that with it turned off, games can look juddery, even at 100+ fps now.

I just want to ask how your stance squares with your argument, given that you can see from this picture that 4K native is blurrier than DLSS.

Sorry but you are wrong.

Please, check this:

 
Joined
Sep 8, 2020
Messages
72 (0.26/day)
System Name Home
Processor 2700x
Motherboard Asrock Taichi x370
Cooling Thermalright True Spirit 140
Memory Patriot 32gb DDR4 3200mhz
Video Card(s) Sapphire Radeon RX 480 NITRO+
Storage Too many to count
Display(s) U2518D+u2417h
Case Chieftec
Audio Device(s) onboard
Power Supply Chieftec 700W
Mouse Logitech g502 hero
Keyboard Logitech
Software Windows
I play around in Photoshop for work from time to time, and there is no trick or plugin in the world that can truly upscale or "AI enhance" an image I throw at it; there is only the perception created by contrast and sharpening, and you can see it.
What would work, with hardly any visible difference, is taking a picture at 20 megapixels and downscaling it to 5 megapixels; looking at the same picture on a 2K monitor you can hardly tell the difference unless you start pixel peeping.
So we know DLSS only works well when you play at very high resolutions, but the question is why the game doesn't scale properly with its textures and so on in the first place.
I think DLSS has to start from a place of wasted resources, and then along comes the miracle technology that helps with the waste.
I'm not holding my breath for AMD doing much in this area, or for this coming to consoles in the future; if a game is properly made in the first place, there is no need for DLSS and other tricks.
 
Joined
Feb 8, 2017
Messages
95 (0.06/day)
What some people describe as a "better" image is just a blander image. Yes, the image is fuller in color, but it's a more monotone color, and it doesn't have as many pixels as the original image.

I'd compare it to using too much sharpening: when you blur the image it can sort of appear as if the image quality is better, so in some instances DLSS can appear to some people to have better image quality, but in reality it doesn't.

In fact, in the universally cited "best game" for ray tracing and DLSS, Cyberpunk 2077, DLSS quality mode is much worse than native. Gamers Nexus had a great video showing dozens of pictures at the start and asking you to figure out which one is which, from native to DLSS Performance to the maximum DLSS Quality. Literally 99% of people could tell which was native and which was DLSS.

Now, that video did show small areas where DLSS had slightly better text legibility, so there are some small, very specific image quality improvements, but for 95% or more of the image it is easily distinguishable and much WORSE than native. We are also talking about rendering at 4K, which uses the highest input resolution; don't forget that the further down you go, the worse the image quality gets! So if you play at 1080p and use DLSS at that resolution, your output is going to be much worse than at 4K.
 
Joined
Oct 12, 2005
Messages
163 (0.03/day)
Also, as with all upscaling, the higher the output resolution, the better the results.

Even standard upscaling plus sharpening on a 4K monitor is fine and usable.
The same thing on a 1440p monitor is not that great, but not that bad either; somewhat usable.
On a 1080p monitor it's just bad.

That is true for every upscaling technology, including DLSS. Digital Foundry (who have made many paid Nvidia-sponsored presentations with very doubtful claims) aren't telling you the full story; they only look at what makes the thing shine. A truly neutral and objective outlet wouldn't have made, for example, that video claiming a 3080 was twice as fast as a 2080 Ti. They say a lot of things that are true, but you can mislead people by steering the narrative in a specific direction and omitting all the negatives.

It's clear that internally DLSS adds a sharpening filter. Apply that to the native image and it will look even better; you don't need upscaling for that. I also think Nvidia is trying hard to convince people there is a huge AI component, because they want people to believe they need the silicon added for the pro/business AI accelerators, but many doubt that part. Some bugs have shown that it probably uses a form of TAA, and it also offsets the rendering of each frame slightly so that the lower-resolution render captures a bit more detail each frame.
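That per-frame offset is sub-pixel camera jitter: each frame samples the scene at a slightly different position, so the accumulated history effectively sees a higher-resolution signal, as long as nothing moves. A sketch using a Halton sequence, the usual low-discrepancy choice for this (my illustration; which exact sequence any given implementation uses is an assumption here):

```python
def halton(index: int, base: int) -> float:
    """Radical inverse of `index` in `base`; a value in [0, 1)."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def jitter_offsets(n: int):
    """Per-frame sub-pixel camera offsets in [-0.5, 0.5) pixels,
    from the (2, 3) Halton sequence."""
    return [(halton(i, 2) - 0.5, halton(i, 3) - 0.5)
            for i in range(1, n + 1)]

for dx, dy in jitter_offsets(4):
    print(f"({dx:+.3f}, {dy:+.3f})")
# Accumulating frames sampled at these shifting offsets lets a low-res
# render resolve detail a single frame cannot -- until motion
# invalidates the history.
```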

Nothing here couldn't be implemented as a more open solution, and that is really where DLSS fails. It is good technology; anything that helps improve performance is good, even with drawbacks. The problem for me isn't whether it's good or bad tech. It's a good tech with some drawbacks; the problem is that it's a closed technology.

Closed technologies are bad for PC gamers, end of story.
 
Joined
May 3, 2018
Messages
383 (0.34/day)
As a photographer I use a program called Topaz Gigapixel AI to upscale images. I regularly upscale my bird photos by around 30%, sometimes 50%, and as long as my source material is good I cannot tell the difference; often the upscaled image is better. The AI does a spectacular job of improving detail. In one comparison it was impossible to tell the difference between a native 24 MP photo from one camera and the same photo taken with a 61 MP camera, once the smaller image was upscaled. This is the beauty of the new AI training compared to old braindead bicubic upscaling. It's not just about sharpening upscaled low-res data; the AI can help create improved detail because it knows what the texture should look like.

The results I've seen for DLSS when upscaling from, say, 1440p-1800p to 4K are more than good enough to pass muster, and given that in a game you are not usually standing still looking for tiny flaws, I'll take image quality that looks almost as good with 50% higher frame rates any day. I would never try to upscale 1080p to 4K; 1440p minimum.
 
Joined
Dec 22, 2011
Messages
3,501 (1.01/day)
System Name I'm sorry Dave, I'm afraid I can't do that.
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) Palit GTX 980 Ti Super JetStream
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Crossover 27Q 27" 2560x1440 + Hisense 43" 4K
Case Antec 1200
Audio Device(s) Don't be silly
Power Supply XFX 650W Core
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 10
Benchmark Scores Epic
From the way AMD is selling FSR, it is certainly a more attractive proposition than Nvidia's bespoke DLSS. Having said that, I am not expecting FSR to beat DLSS, but as long as it gets 75% of the way there and is easier to implement, I think DLSS may have a hard time going forward.

The key difference is those Nvidia users get to enjoy both.
 
Joined
Jun 3, 2010
Messages
1,914 (0.48/day)
The results I've seen for DLSS when upscaling from, say, 1440p-1800p to 4K are more than good enough to pass muster, and given that in a game you are not usually standing still looking for tiny flaws, I'll take image quality that looks almost as good with 50% higher frame rates any day. I would never try to upscale 1080p to 4K; 1440p minimum.
That is what sells video cards, dear. Don't take the naive consumer point of view; this is big business.
I do hope old-school computer graphics see a revival; everything used to look unique. This AI optimisation can do its thing, but I still have hopes for new filtering modalities. Let's make MadVR the benchmark. There is much to gain if it at least overcomes this overshading 'bug' in the pipeline: much ado about nothing, a sales pitch, a graphics relegator, the quad helper pixel, a planned-obsolescence artifact!
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
5,973 (1.16/day)
System Name MightyX
Processor Ryzen 9 5900X 5ghz
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Noctua NH-L12
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 OC/UV + duct
Storage Samsung 970 Evo m.2 NVME
Display(s) 21:9 3440x1440 144hz 1500R VA
Case Coolermaster NR200P
Power Supply Corsair SF600 Gold
Mouse Zowie EC1-A
Keyboard Razer Blackwidow X Chroma
Software case populated with Artic P12's
I just want to ask how your stance squares with your argument, given that you can see from this picture that 4K native is blurrier than DLSS.
Somewhat of a different bucket there. That's a blurrier (appearing lower resolution) image overall with very little to no movement happening at the time of the screenshot, so motion blur isn't needed. Per object motion blur shines when there is fast object movement, and camera blur would be the same, fast camera movement, not necessarily a blurrier image all the time.
 
Joined
Sep 8, 2020
Messages
72 (0.26/day)
System Name Home
Processor 2700x
Motherboard Asrock Taichi x370
Cooling Thermalright True Spirit 140
Memory Patriot 32gb DDR4 3200mhz
Video Card(s) Sapphire Radeon RX 480 NITRO+
Storage Too many to count
Display(s) U2518D+u2417h
Case Chieftec
Audio Device(s) onboard
Power Supply Chieftec 700W
Mouse Logitech g502 hero
Keyboard Logitech
Software Windows
As a photographer I use a program called Topaz Gigapixel AI to do upscaling of images. I regularly upscale my bird photos by around 30% and sometimes 50% and as long as my source material is good
Topaz plugins are exactly what I was talking about: you can see how hard the so-called "AI" tries; it creates fake pixels and then gives a little boost with selective contrast or selective sharpening on the shadows, mids or highlights. I can see it, and I don't like it.
In the videography world there is a plugin called Twixtor, a roughly 20-year-old plugin that converts, say, 25 fps video to 50 fps. It analyzes the video and creates the new pixels in advance; pray you didn't shoot trees or a scene that's too complex. It does basically the same thing as the modern plugins that call themselves AI-powered.
I'm telling you, DLSS-sponsored titles need to waste a lot of hardware resources so that when you enable the miracle technology it "works" as intended.
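The failure mode described for such retiming plugins is easiest to see in the naive fallback, plain frame blending, which is what motion-compensated interpolators degrade to when a scene is too complex to analyze (a toy sketch, not Twixtor's actual motion-vector algorithm):

```python
import numpy as np

def blend_midframe(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Naive 25 -> 50 fps in-between frame: average the two neighbors.
    With real motion this double-exposes the object instead of moving it."""
    return 0.5 * (a + b)

# A bright dot at x=2 in frame A and at x=5 in frame B of a 1-D "video".
a, b = np.zeros(8), np.zeros(8)
a[2], b[5] = 1.0, 1.0
mid = blend_midframe(a, b)
# The interpolated frame shows two half-bright dots (at x=2 and x=5)
# rather than one dot at the midpoint: the classic blend artifact.
```

Motion-vector-based interpolation fixes this only when the flow estimate is right, which is exactly why fine, repetitive detail like foliage breaks it.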
 
Joined
Mar 28, 2020
Messages
805 (1.82/day)
Sorry but you are wrong.

Please, check this:

View attachment 199551

No sense? Better definition and 90 fps instead of 45 fps... and it keeps getting better over time. Yesterday Metro Exodus was re-released with DLSS 2.0; on low-end cards like the RTX 2060, 49 fps instead of 11.3 fps... c'mon, DLSS gets a lot more juice out of these GPUs.

View attachment 199552

The tech is awesome.
Again, I think DLSS is a great technology for balancing performance against image quality. However, I do sometimes wonder whether game developers deliberately make a game look worse without DLSS, especially a game like Control, which is an Nvidia-sponsored title and seems to get the most spotlight from Nvidia when it comes to DLSS performance and image quality. I would rather look at a non-Nvidia-sponsored game for this comparison. Blurriness aside, the reason I ask is that if I look at a game like Shadow of the Tomb Raider (also an Nvidia-sponsored title) and compare it with the 4K image from Control, where the hair somehow looks mangled at the ends, I certainly don't see that issue with Lara's hair. In short, the native 4K image looks suspiciously bad. It's possible for DLSS to make the image sharper in some cases while introducing other issues, especially when your character is moving; there is no perfect solution.

The key difference is those Nvidia users get to enjoy both.
I feel that is how AMD is trying to hamper adoption of DLSS. When there is a viable alternative to DLSS that is easier to implement without a significant compromise in quality, will game developers still want to spend time on DLSS, especially for day one? It may come as a later addition, but I suspect it will have a knock-on impact on DLSS uptake unless the game is sponsored by Nvidia. While this is not a perfect comparison, you can take the G-Sync vs FreeSync outcome as a reference. The latter is known to be not as good as G-Sync, but monitor makers have mostly adopted FreeSync over G-Sync. Yes, there is a significant cost involved in implementing G-Sync, which makes this a different situation, but it also proves the point that when there is a viable alternative that is hardware-agnostic and cheaper, people will tend to go for it.
 