
Death Stranding with DLSS 2.0 Enables 4K-60 FPS on Any RTX 20-series GPU: Report

Joined
May 31, 2017
Messages
877 (0.35/day)
Location
Home
System Name Blackbox
Processor AMD Ryzen 7 3700X
Motherboard Asus TUF B550-Plus WiFi
Cooling Scythe Fuma 2
Memory 2x8GB DDR4 G.Skill FlareX 3200Mhz CL16
Video Card(s) MSI RTX 3060 Ti Gaming Z
Storage Kingston KC3000 1TB + WD SN550 1TB + Samsung 860 QVO 1TB
Display(s) LG 27GP850-B
Case Lian Li O11 Air Mini
Audio Device(s) Logitech Z200
Power Supply Seasonic Focus+ Gold 750W
Mouse Logitech G305
Keyboard MasterKeys Pro S White (MX Brown)
Software Windows 10
Benchmark Scores It plays games.
The main hurdle with DLSS as I see it is cost. I don’t know if Nvidia changed their stance on it in the meantime, but at launch DLSS was a feature that game developers had to pay for.
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
I'm not that interested in an interpretation of what the Dev wanted me to play.
If it goes like DLSS 1.0, RTX, PhysX, HairWorks or GameWorks, it's not going to shake the world.

Big meh.

Also, old games scale better with resolution because they were made to actually render at that resolution; most modern games basically frame-scale a 1080p world, with a few, like GTA V, giving you the option to use 100% frame scaling.
 
Joined
Apr 21, 2010
Messages
562 (0.11/day)
System Name Home PC
Processor Ryzen 5900X
Motherboard Asus Prime X370 Pro
Cooling Thermaltake Contac Silent 12
Memory 2x8gb F4-3200C16-8GVKB - 2x16gb F4-3200C16-16GVK
Video Card(s) XFX RX480 GTR
Storage Samsung SSD Evo 120GB -WD SN580 1TB - Toshiba 2TB HDWT720 - 1TB GIGABYTE GP-GSTFS31100TNTD
Display(s) Cooler Master GA271 and AoC 931wx (19in, 1680x1050)
Case Green Magnum Evo
Power Supply Green 650UK Plus
Mouse Green GM602-RGB ( copy of Aula F810 )
Keyboard Old 12 years FOCUS FK-8100
Better not to go 4K/60 fps, otherwise users will get stuck at 4K and won't be able to go back to 1440p or 1080p. The difference between the two is massive once you've experienced it.
 
Joined
Dec 16, 2012
Messages
540 (0.13/day)
Processor AMD Ryzen R7 5800x
Motherboard B550i Aorus Pro AX
Cooling Custom Cooling
Memory 32Gb Patriot Viper 3600 RGB
Video Card(s) MSI RTX 3080 Ventus Trio OC
Storage Samsung 960 EVO
Display(s) Specterpro 34uw100
Case SSUPD Meshlicious
Power Supply Cooler Master V750 Gold SFX
Mouse Glorious Model D Wireless
Keyboard Ducky One 2
VR HMD Quest 2
Software Windows 11 64bit
Better not to go 4K/60 fps, otherwise users will get stuck at 4K and won't be able to go back to 1440p or 1080p. The difference between the two is massive once you've experienced it.

Good thing I haven't played on a true 4K60 panel. My 2070S barely runs ultrawide 1440p.

And yes, I played Control with DLSS 2 and all the RTX effects, and it looks a little better than native res.
 
Joined
Nov 4, 2019
Messages
234 (0.14/day)
This is true.
DLSS 2.0 looks very good, but I must admit it's something your eyes have to adjust to.

I would kill for Mario Odyssey in 4K 120 fps; that is an example of game rendering that is raw, sharp and pure. I personally will take jaggies over blur any day. One way to tell what kind of rendering is happening is to ask whether the only blurriness comes from texture resolution. Mario Odyssey with new 16GB-of-VRAM textures? It would look perfectly sharp. No matter how high-resolution the textures you use, Control won't be sharp.
 

bug

Joined
May 22, 2015
Messages
13,245 (4.04/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Nah he is just salty.

DLSS 2.0 is really amazing: a great performance boost for GPUs that lack raw compute power. As for picture quality, numerous tests have shown DLSS 2.0 is the same as, or sometimes better than, native-resolution rendering.

The more games implement DLSS 2.0, the better it is for end users with GPUs that support it.
The thing people don't get about AI is that the more you train it, the better it gets. Like Google's search.
DLSS 2.0 has already shed the per-title training that DLSS 1.0 carried. I can only guess where DLSS 3 or 4 will go.

Picture quality, though, is subjective, because there's no reference: it's just comparing one approximation of the desired result (no DLSS) with another approximation (DLSS).
At the end of the day, these are just (clever) tricks that allow us to game at higher settings than the hardware could push otherwise. Does anyone remember how manufacturers were stuck for a while trying to do SSAA better, until MSAA came along? Guess what, MSAA is still not on par with SSAA, but in the meantime we've come to call MSAA too taxing and have become willing to accept even more IQ compromises (e.g. TAA). Why? Because of performance.
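To make the SSAA reference concrete, here is a minimal Python/numpy sketch of what supersampling boils down to: shade many samples per target pixel, then box-filter them down. Purely illustrative, and not how any GPU pipeline actually implements AA:

import numpy as np

def ssaa_downsample(frame, factor):
    # Average each factor x factor block of supersampled pixels (box filter),
    # which is the essence of SSAA: shade more samples, then resolve down.
    h, w, c = frame.shape
    h2, w2 = h // factor, w // factor
    return frame[:h2 * factor, :w2 * factor].reshape(h2, factor, w2, factor, c).mean(axis=(1, 3))

# Stand-in "supersampled" frame: 4x4 samples per target pixel, random shading.
rng = np.random.default_rng(0)
frame_4x = rng.random((270 * 4, 480 * 4, 3))
resolved = ssaa_downsample(frame_4x, 4)
print(resolved.shape)  # (270, 480, 3)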
 

lugaidster

New Member
Joined
Jul 1, 2020
Messages
2 (0.00/day)
I like how most media sites are stroking Nvidia's ego with this. Arstechnica's Death Stranding for PC preview says it supports both DLSS 2.0 and AMD's FidelityFX upscaling tech. More importantly, it says that FidelityFX's upscaler preserves more detail and provides more of a boost. Most importantly, FidelityFX is vendor-neutral: their tests were done on Nvidia hardware. I especially like the fact that FidelityFX works with Pascal while DLSS 2.0 doesn't.

So... seems to me DLSS 2.0 remains overhyped. I'm sure Nvidia's marketing money will bury this, though.

The thing people don't get about AI is that the more you train it, the better it gets.

This isn't strictly true. Depending on the model and the training set, it's very easy to get trapped in local optima. The big issue is that it's very hard to know whether you did. Moreover, when getting close to a local optimum, more training does very little to improve inference performance.

Don't get me wrong, DLSS is good tech, but it's still an approximation. There's just nothing inherent to it that makes it better at solving this particular problem than other approximations.
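As a toy illustration of that local-optimum trap (a hand-picked 1D "loss", nothing to do with DLSS's real training), plain gradient descent in Python lands in different minima depending only on where it starts, and the loop itself gives no hint which one you got:

def loss(x):
    # Hand-picked curve with a shallow minimum near x = -1.4
    # and a deeper (global) one near x = +1.7.
    return 0.1 * x**4 - 0.5 * x**2 - 0.3 * x + 1.0

def grad(x):
    return 0.4 * x**3 - x - 0.3

def descend(x, lr=0.05, steps=500):
    for _ in range(steps):
        x -= lr * grad(x)
    return x

for start in (-2.0, 2.0):
    x = descend(start)
    print(f"start {start:+.1f} -> x = {x:+.2f}, loss = {loss(x):.3f}")
# The run from -2.0 settles in the shallow local minimum (loss ~0.82),
# the run from +2.0 finds the deeper one (loss ~-0.12); nothing in the
# training loop itself tells you which outcome you got.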
 
Last edited:

bug

Joined
May 22, 2015
Messages
13,245 (4.04/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
This isn't strictly true. Depending on the model and the training set, it's very easy to get trapped in local optima. The big issue is that it's very hard to know whether you did. Moreover, when getting close to a local optimum, more training does very little to improve inference performance.

Don't get me wrong, DLSS is good tech, but it's still an approximation. There's just nothing inherent to it that makes it better at solving this particular problem than other approximations.
I know that, you know that. I bet Nvidia knows that.
I imagine it's pretty easy to avoid a local optimum if you train multiple times, applying the inputs in a different order. Or if you develop an automated tool to analyze artifacts for you.
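A sketch of that mitigation in the same toy setting as the gradient-descent example above (the toy has no dataset, so random starting points stand in for "applying inputs in a different order"): run the loop several times and keep the best result. It improves the odds, but still cannot certify a global optimum:

import random

def loss(x):
    return 0.1 * x**4 - 0.5 * x**2 - 0.3 * x + 1.0

def grad(x):
    return 0.4 * x**3 - x - 0.3

def train_once(x, lr=0.05, steps=500):
    for _ in range(steps):
        x -= lr * grad(x)
    return x

random.seed(0)
candidates = [train_once(random.uniform(-3.0, 3.0)) for _ in range(8)]
best = min(candidates, key=loss)
print(f"best of 8 restarts: x = {best:+.2f}, loss = {loss(best):.3f}")
# Keeping the lowest-loss run filters out the shallow minimum here,
# but it is still only "best seen so far", not a proven global optimum.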

What makes DLSS better is that the computation for the approximation is done by Nvidia on their servers and the driver only has to apply the result when rendering. It doesn't guarantee any level of quality (though for only a second version, I think it does pretty well), but it guarantees more performance.
If you worry about loss of quality, you should worry more about variable rate shading. And since that's in DirectX, it's going to be everywhere.
 

lugaidster

New Member
Joined
Jul 1, 2020
Messages
2 (0.00/day)
I know that, you know that. I bet Nvidia knows that.

Yes, but the people you're trying to enlighten don't necessarily know that. Otherwise, why mention that more training makes it better in the first place, eh?

I imagine it's pretty easy to avoid the local optimum solution if you train multiple times, applying inputs in a different order. Or if you develop an automated tool to analyze artifacts for you.

It isn't. It's actually pretty hard because you don't know when you've actually gotten to a global optimum. At least not when training neural networks. You can easily weed out poor performers, but the same is not true once you get to a decent level of competence.

With respect to developing a tool to analyze artifacts, that's actually pretty hard too, because artifacts are subjective. You're going to get artifacts regardless; which ones are actually acceptable?

If you had an accurate model that could easily allow you to identify a global optimum, you'd be able to build a mathematical model to achieve it and, thus, model it with a regular algorithm. You wouldn't need neural networks in the first place.

What makes DLSS better is that the computation for the approximation is done by Nvidia on their servers and the driver only has to apply the result when rendering. It doesn't guarantee any level of quality (though for only a second version, I think it does pretty well), but it guarantees more performance.

You've basically described all upscalers.

If you worry about loss of quality, you should worry more about variable rate shading. And since that's in DirectX, it's going to be everywhere.

This doesn't even make sense. If I'm going to use an upscaler, why wouldn't I complain about quality if there are better upscalers out there?
 
Joined
Jul 14, 2008
Messages
872 (0.15/day)
Location
Copenhagen, Denmark
System Name Ryzen/Laptop/htpc
Processor R9 3900X/i7 6700HQ/i7 2600
Motherboard AsRock X470 Taichi/Acer/ Gigabyte H77M
Cooling Corsair H115i pro with 2 Noctua NF-A14 chromax/OEM/Noctua NH-L12i
Memory G.Skill Trident Z 32GB @3200/16GB DDR4 2666 HyperX impact/24GB
Video Card(s) TUL Red Dragon Vega 56/Intel HD 530 - GTX 950m/ 970 GTX
Storage 970pro NVMe 512GB,Samsung 860evo 1TB, 3x4TB WD gold/Transcend 830s, 1TB Toshiba/Adata 256GB + 1TB WD
Display(s) Philips FTV 32 inch + Dell 2407WFP-HC/OEM/Sony KDL-42W828B
Case Phanteks Enthoo Luxe/Acer Barebone/Enermax
Audio Device(s) SoundBlasterX AE-5 (Dell A525)(HyperX Cloud Alpha)/mojo/soundblaster xfi gamer
Power Supply Seasonic focus+ 850 platinum (SSR-850PX)/165 Watt power brick/Enermax 650W
Mouse G502 Hero/M705 Marathon/G305 Hero Lightspeed
Keyboard G19/oem/Steelseries Apex 300
Software Win10 pro 64bit
Lowest common denominator. So all this... is useless, because these features do not exist at the lowest common denominator. It's really ironic that tech enthusiasts fail to grasp tech reality. I'm sorry, but 90% of people will not buy $400 or $600 or $1,200 GPUs to run RTX or DLSS or any of this other cr@p in 3 damn games. And if you think they will, then you are in total disconnect from reality.
 

bug

Joined
May 22, 2015
Messages
13,245 (4.04/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Yes, but the people you're trying to enlighten don't necessarily know that. Otherwise, why mention that more training makes it better in the first place, eh?



It isn't. It's actually pretty hard because you don't know when you've actually gotten to a global optimum. At least not when training neural networks. You can easily weed out poor performers, but the same is not true once you get to a decent level of competence.

With respect to developing a tool to analyze artifacts, that's actually pretty hard too, because artifacts are subjective. You're going to get artifacts regardless; which ones are actually acceptable?

If you had an accurate model that could easily allow you to identify a global optimum, you'd be able to build a mathematical model to achieve it and, thus, model it with a regular algorithm. You wouldn't need neural networks in the first place.



You've basically described all upscalers.



This doesn't even make sense. If I'm going to use an upscaler, why wouldn't I complain about quality if there are better upscalers out there?
Yeah, well, in this case it's pretty easy to spot a local minimum: you look at the image (using your own eyes or a high-pass filter) and see whether anything sticks out.
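A rough Python sketch of that high-pass check (a generic box-blur residual, assumed here for illustration, not anything Nvidia is known to use): subtract a blur from the frame and see how much high-frequency energy sticks out compared to a clean reference:

import numpy as np

def highpass(img):
    # 3x3 box blur built from nine shifted copies, then keep the residual:
    # edges, ringing and shimmer show up as large residual values.
    pad = np.pad(img, 1, mode="edge")
    h, w = img.shape
    blur = sum(pad[dy:dy + h, dx:dx + w] for dy in range(3) for dx in range(3)) / 9.0
    return np.abs(img - blur)

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0.0, 1.0, 480), (270, 1))    # smooth ramp "frame"
noisy = clean + 0.05 * (rng.random((270, 480)) - 0.5)    # fake upscaling shimmer

for name, frame in (("clean", clean), ("with artifacts", noisy)):
    e = highpass(frame)
    print(f"{name}: mean residual {e.mean():.4f}, pixels over 0.02: {int((e > 0.02).sum())}")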
 
Joined
Jun 27, 2016
Messages
290 (0.10/day)
System Name MacBook Pro 16"
Processor M1 Pro
Memory 16GB unified memory
Storage 1 TB
Too bad it's Death Stranding.. that game sucks ass
 

bug

Joined
May 22, 2015
Messages
13,245 (4.04/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Too bad it's Death Stranding.. that game sucks ass
The game itself doesn't matter much. If one game can do it, others can, too ;)

Edit: Oh, I see what's going on here. The game is actually pretty good, but we pretend it's bad because Nvidia has something to do with it. My bad.
 
Last edited:
D

Deleted member 185088

Guest
1070? I'm pretty sure DLSS doesn't even work with that.
Of course not, DLSS 2.0 is only available on RTX 20-series cards.
Sorry for the confusion: I mentioned the 1070 I used for 4K gaming and how underpowered it was.
I've looked at screenshots of DLSS 2.0 and it seems more like a sharpening filter.

I didn't even think you could get 24" 4k, let alone it be worth bothering with. You learn something new everyday.
They do exist and damn they are sharp, especially for text it's a pleasure to use them.
 
Joined
Jul 18, 2017
Messages
575 (0.23/day)
Nvidia is multiple generations ahead of everyone else. DLSS 3.0 is pretty much an apocalypse for non-Nvidia users.
 

M2B

Joined
Jun 2, 2017
Messages
284 (0.11/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
The amount of ignorance on this thread is truly astounding.
DLSS 2.0 is the absolute best image reconstruction technique in gaming, yet people are moaning "but it's not real 4K". Just educate yourself and appreciate the technology, you don't have to complain about every damn thing Nvidia-Related.
what a joke.
 
Last edited:
Joined
Mar 28, 2020
Messages
1,651 (1.10/day)
I like how most media sites are stroking Nvidia's ego with this. Arstechnica's Death Stranding for PC preview says it supports both DLSS 2.0 and AMD's FidelityFX upscaling tech. More importantly, it says that FidelityFX's upscaler preserves more detail and provides more of a boost. Most importantly, FidelityFX is vendor-neutral: their tests were done on Nvidia hardware. I especially like the fact that FidelityFX works with Pascal while DLSS 2.0 doesn't.

So... seems to me DLSS 2.0 remains overhyped. I'm sure Nvidia's marketing money will bury this, though.

I agree. Needing Tensor cores/AI to perform this upscaling sounds better in theory; in practice, however, even a dumbed-down version like FidelityFX is capable of improving performance without significant loss in image quality. If this supposed AI upscaling were significantly better in terms of image quality and ease of implementation, DLSS 1.0 would not have failed in the first place. To me, it is the usual Nvidia strategy of bringing in proprietary technology to retain their foothold.

The amount of ignorance on this thread is truly astounding.
DLSS 2.0 is the absolute best image reconstruction technique in gaming, yet people are moaning "but it's not real 4K". Just educate yourself and appreciate the technology, you don't have to complain about every damn thing Nvidia-Related.
what a joke.

Most people wouldn't have that much to complain about if:
1. it weren't a proprietary technology, limited to only certain hardware;
2. you didn't have to pay a premium to Nvidia because of it, while the competitor provides a dumbed-down upscaler that works across hardware.

Anyway, regardless of the image quality, the fact remains that it is not true 4K; that is the whole point of DLSS.
 
Joined
Sep 17, 2014
Messages
20,992 (5.96/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Me: "Pffft, a walking simulator, who needs such a thing"
Covid-19 "Hi!"
Me: "Wow, it's like actually being outside!"

Hah but you never walked outside at TWO HUNDRED FORTY EF PEE ES

The tech press should not be spreading lies. 4K is 4K, and DLSS is not. You can claim your image looks as good, that's fine, I disagree, but then say "it looks as good as 4K": use actual English that has meaning instead of marketing speak.

Many modern games use rendering techniques that look blurry and don't scale with resolution anymore. Play Detroit, for example, and you'll notice that much of the image doesn't improve when you switch from 1440p to 4K. Play old games and it is totally different. That is why DLSS is being pushed: it is easy to fool people with games like Detroit, Death Stranding, Control and FF15 that are chronically blurry.

I personally prefer the visuals of last gen: games like Mass Effect 3 and old Unreal games before the AA craze, where resolution actually leads to a proper image. The fetish for realistic graphics has led to transparency techniques for hair and vegetation that are blurry and horrible imo. I'd rather play Trials of Mana at 4K 120 fps and enjoy crisp visuals than play most of the recent stuff we are getting. I even prefer the visuals of Fortnite over DLSS games.

It seems the market is diverging a bit. I love Riot Games' commitment to fast performance and clean rendering techniques.

Inclined to agree. Some games are so blurry it's painful. I even kill AA altogether when only temporal AA is available. Yes, the jaggies are gone... along with the sharpness. It gets even worse when there's a (fixed!) sharpening pass on top of that; suddenly you're looking at a cartoon. SOMETIMES, T(x)AA looks good. But whenever it does, it also hits performance pretty hard.

That said, DLSS at very high resolutions is a lot better than TAA most of the time. Definitely a middle ground preferable to newer AA in general.

Resident Evil (1, reboot) took the cake. It has an internal render resolution slider... even at max it's STILL noticeably below your native display res. Like you're rendering 720p content on a 1080p screen. And if that isn't enough... some games place a constant chromatic aberration horror on top of that. Want some LSD with your game?

I agree. Needing Tensor cores/AI to perform this upscaling sounds better in theory; in practice, however, even a dumbed-down version like FidelityFX is capable of improving performance without significant loss in image quality. If this supposed AI upscaling were significantly better in terms of image quality and ease of implementation, DLSS 1.0 would not have failed in the first place. To me, it is the usual Nvidia strategy of bringing in proprietary technology to retain their foothold.



Most people wouldn't have that much to complain about if:
1. it weren't a proprietary technology, limited to only certain hardware;
2. you didn't have to pay a premium to Nvidia because of it, while the competitor provides a dumbed-down upscaler that works across hardware.

Anyway, regardless of the image quality, the fact remains that it is not true 4K; that is the whole point of DLSS.

FidelityFX you say? God no. It's far worse; you're better off using some SweetFX filters from 2010. I'm not even joking.

I'll happily pay a premium for features that actually do improve the experience, and in the case of DLSS, there is virtually free performance on the table, so is it really a premium... some beg to differ.

Not that I've jumped on Turing at this point, and I never would have for DLSS alone either... but the tech is pretty impressive by now. IF you can use it. That has been the deal breaker up until now, but apparently they're working to make it game agnostic. And note... the competitor only provides a dumbed-down version because it's cheap, it's easy, and it somehow signifies 'they can do it too' when in reality they really can't. Let's call it what it is, okay?
 
Last edited:
Joined
Mar 21, 2016
Messages
2,203 (0.74/day)
The thing people don't get about AI is that the more you train it, the better it gets.
That's only true up to a certain threshold, and within a certain tolerance of the performance required as well. The lower the base resolution, the less information you have to work with: on the one hand, the better the perceived uplift within a given performance budget; on the other hand, the worse the quality uplift relative to working from a higher base resolution, if the performance budget were also scaled to account for it. Higher resolutions require more performance; there is no getting around that without sacrificing image quality for the sake of performance.

The amount of ignorance on this thread is truly astounding.
DLSS 2.0 is the absolute best image reconstruction technique in gaming, yet people are moaning "but it's not real 4K". Just educate yourself and appreciate the technology, you don't have to complain about every damn thing Nvidia-Related.
what a joke.
DLSS isn't bad, and it has definitely made some big relative improvements, but it's still certainly not the same as native 4K. More than anything else, it's a fancy post-process algorithm combined with upscaling, nothing more, nothing less. It has gotten better and more convincing at it, and it has its perks, especially in terms of performance. That said, labelling it the same as native is unfair. The bigger problem is that DLSS isn't just something you turn on and it works in all cases; it needs developers to specifically enable it. That is a key difference. If it worked on every game, all the time, and equally well when enabled, it would be pretty noteworthy, but that's clearly not the case. Nvidia has a lot of AA techniques that are situational and require a special developer relationship to work in the first place, and like those, DLSS is a really useless piece of tech when it doesn't work in all those older or unsupported titles. I tend to prefer tech that just f*cking works, all the time, period. I have nothing against Nvidia or DLSS inherently, but I think tech that works across the board, under all circumstances, is far more beneficial and wider reaching. How much have we heard about Mantle lately!? It sounded great: closer to the metal, better performance, seems legit. But in reality you're dependent on the developer, and when you're dependent on them, mostly only the AAA developers leverage the hardware to its full extent in the grand scheme.
 
Last edited:
Joined
Dec 22, 2011
Messages
3,890 (0.86/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
I see VALVe have invested some love into this title too.

Guess I'll have a crack at being Postman Pat too.

Beyond that, you have two choices when it comes to graphics cards, pick your poison and quit whining year in year out.
 
Joined
Aug 6, 2017
Messages
7,412 (3.00/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT

Scroll down and compare native resolution + TAA vs DLSS 2.0 with a slider, then look at the performance gains: nearly 40%.
This is crazy. The image is clearer and more detailed, and it runs way faster.
Dunno what AMD FidelityFX is, but it looks like a mess.

dlss2.jpg
 
Last edited:

bug

Joined
May 22, 2015
Messages
13,245 (4.04/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10

Scroll down and compare native resolution + TAA vs DLSS 2.0 with a slider, then look at the performance gains: nearly 40%.
This is crazy. The image is clearer and more detailed, and it runs way faster.
Dunno what AMD FidelityFX is, but it looks like a mess.

View attachment 162274
This is FidelityFX: https://gpuopen.com/fidelityfx-cas/
A sharpening filter, mostly. It says it does up/downscaling as well, but it's unclear how it does that.
 
Joined
Aug 6, 2017
Messages
7,412 (3.00/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
This is FidelityFX: https://gpuopen.com/fidelityfx-cas/
A sharpening filter, mostly. It says it does up/downscaling as well, but it's unclear how it does that.
looks like crap here

FidelityFX (left) vs DLSS 2.0 (right)
fidelity.jpg


lol, it's this AMD version of Nvidia's DLSS that's so much better according to the red team fanbase
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.

It's "better than 4K". (no jokes, some green brains claim verbatim 'better than original' upscaling)
 

bug

Joined
May 22, 2015
Messages
13,245 (4.04/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
looks like crap here

FidelityFX (left) vs DLSS 2.0 (right)
View attachment 162277

lol, it's this AMD version of Nvidia's DLSS that's so much better according to the red team fanbase
It does look pretty good in their own screenshots. Give it time, DLSS wasn't great from the beginning either.
It's just that AMD doesn't talk about upscaling in FidelityFX. I'm pretty sure this is mostly a sharpening filter + some classic interpolation (e.g. bi/trilinear, nearest neighbor).
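To make that guess concrete, here is a minimal numpy sketch of the generic recipe described above: nearest-neighbour interpolation followed by a Laplacian sharpening pass. This is only an illustration of the idea, not AMD's actual CAS shader:

import numpy as np

def upscale_nearest(img, factor):
    # Classic interpolation at its simplest: repeat each pixel factor x factor times.
    return img.repeat(factor, axis=0).repeat(factor, axis=1)

def sharpen(img, amount=0.25):
    # Add back a Laplacian high-pass of the image, clamped to [0, 1].
    pad = np.pad(img, 1, mode="edge")
    lap = 4 * img - pad[:-2, 1:-1] - pad[2:, 1:-1] - pad[1:-1, :-2] - pad[1:-1, 2:]
    return np.clip(img + amount * lap, 0.0, 1.0)

rng = np.random.default_rng(0)
low_res = rng.random((1080 // 2, 1920 // 2))     # stand-in 540p luma frame
output = sharpen(upscale_nearest(low_res, 2))    # "1080p" output with extra bite
print(output.shape)  # (1080, 1920)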

It's "better than 4K". (no jokes, some green brains claim verbatim 'better than original' upscaling)
Well, when someone claims "better than original", it's pretty clear they don't know what they're talking about. Hint: there's no "original" in CGI ;)
True 4k is still an approximation of a geometric model as seen through the rendering pipeline.
 
Last edited: