
NVIDIA DLSS 2.5.1

Joined
Nov 11, 2016
Messages
3,045 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razer Viper Ultimate
Keyboard Corsair K75
Software win11
Yeah, I agree The Witcher 3 looks better, but it's game dependent. It's still nothing like a true 4K image, though. When I played CP2077 the first time around, I think the DLSS implementation was very early. I never really noticed it much, but I also had RT on lower settings (or maybe it did that anyway because of DLSS?). Anyway, I had bought a G-Sync monitor, so playing a single-player FPS didn't need high frame rates; 40-45 FPS was smooth enough for me.

DLSS has evolved to the point where a newer version like 2.5.1 provides the best IQ across all DLSS 2 games, compared to the stock DLL that shipped with each game. CP2077 originally shipped with version 2.1, which didn't even apply a negative mipmap bias (which Nvidia advised should be used with DLSS).
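For reference, the mip bias Nvidia's integration guidance recommends is (as far as I remember it) log2(render width / display width), which comes out negative whenever you render below native. A quick sketch of the math:

```python
import math

def dlss_mip_bias(render_width: int, display_width: int) -> float:
    # Recommended texture LOD bias when upscaling (per my reading of
    # Nvidia's guidance): log2(render / display). Negative because the
    # internal render resolution is lower than the display resolution.
    return math.log2(render_width / display_width)

# e.g. 4K output with DLSS Quality (internal render ~2560x1440):
print(dlss_mip_bias(2560, 3840))  # ~ -0.585
```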

With the latest DLSS 2.5.1 you can try a notch-lower DLSS setting (e.g. Quality --> Balanced) and the resulting IQ is comparable to an older version at the higher setting. Sure, you can play a game at 40-45 FPS, but getting 10-15% higher FPS without any noticeable IQ loss is nice.

I just spent about 10 hours playing Witcher 3 on my old rig with a 2080 Ti at 3440x1440, DLSS Performance, and it looks good with RT; it plays nicely enough at 40-45 FPS.
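If anyone wants to do the DLL swap across their whole library, here's roughly how I'd script it. A minimal sketch only: nvngx_dlss.dll is the file DLSS 2.x games ship with, but GAMES_DIR and NEW_DLL are made-up example paths you'd need to change. It keeps a backup of each stock DLL so you can roll back.

```python
from pathlib import Path
import shutil

GAMES_DIR = Path(r"C:\Games")                    # example path: your game library root
NEW_DLL = Path(r"C:\Downloads\nvngx_dlss.dll")   # example path: the 2.5.1 DLL you grabbed

for old in GAMES_DIR.rglob("nvngx_dlss.dll"):
    backup = old.parent / "nvngx_dlss.dll.bak"
    if not backup.exists():
        shutil.copy2(old, backup)   # keep the stock DLL so you can roll back
    shutil.copy2(NEW_DLL, old)      # drop in the newer version
    print(f"swapped {old}")
```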
 

Oshim

New Member
Joined
Oct 18, 2022
Messages
4 (0.01/day)
WOW... This new version seems pretty sh*t :( And I'm not even comparing it to native resolution, just to version 2.4.3...

Way to go NVIDIA!
 
Joined
Dec 25, 2020
Messages
4,444 (3.73/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1F
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Logitech G305 Lightspeed K/DA + Logitech G840 XL K/DA
Keyboard Logitech G Pro TKL K/DA
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
Ultra Performance IS the 720p option

It's the lowest of the low, so at least they've improved upon it


When your 3050 laptop GPU is 5 years old, you'll be grateful it has DLSS Ultra performance

In newer games, DLSS or a similar technology is already required just to conserve VRAM, since the card only has 4 GB or so... otherwise I really enjoy the 3050 mobile; rather solid little GPU if you ask me.
 
Joined
Dec 12, 2012
Messages
711 (0.17/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
GPU power consumption during Witcher 3 RT is significantly lower than during CP2077. On my GPU, Witcher 3 RT uses 320-350 W, where CP2077 uses 420-450 W.

Even in these videos you can clearly see the difference in GPU power consumption: CP2077 consumes 230-250 W, and Witcher 3 RT consumes 180-200 W.

The lower performance in Witcher 3 RT doesn't come from it being "more GPU demanding", but from it being a badly optimized mess that doesn't fully utilize the GPU and runs horribly because of it.

This often happens when a game is not built around DX12/Vulkan from the beginning. The Witcher ends up being CPU-bound in DX12, which is the exact opposite of what this API is supposed to do. And it is not because of ray-tracing, as even with all RT features off, the game runs much slower compared to DX11.
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,726 (1.25/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RGB Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12's
Benchmark Scores 4k120 OLED Gsync bliss
DLSS Ultra Finewine :D
Isn't it just a little bit amusing? Over the past two-ish years, Nvidia RTX cards have certainly gotten 'finewine' drivers and additional software betterments, like DLSS improvements. Not only that, but instead of feeling like something was left on the table at launch, like buggy AMD cards that don't live up to the hype, we got all the juicy good stuff and performance at launch, and then it got even better.
Also it's ridiculous that there are people who think 1080p Native looks anywhere close to 4K DLSS Ultra Perf
Without a doubt in my mind. DLSS 2.5.1 Ultra Performance mode at 4K looks closer to 4K than it does to native 720p. Don't believe me? Go try it; 720p is mud compared to DLSS UP mode at 4K. It will be funny, invariably in some other thread of course, to hear how that can't be true and it must be worse because someone saw some screenshots somewhere, fresh from the mouth of someone who doesn't own RTX hardware. You may think that sounds so daft it's impossible, but it happens.
 
Joined
Oct 12, 2005
Messages
681 (0.10/day)
DLSS Ultra Finewine :D

Also it's ridiculous that there are people who think 1080p Native looks anywhere close to 4K DLSS Ultra Perf

W3 4K DLSS UP vs 1080p TAA


Quite sad that so many games don't have FSR 2.0/DLSS 2.x.

If it were in every game by default, a 4K monitor would be the way to go, and you would just use whatever upscaling level you need to hit your performance target. You would get better image quality with 1080p-class hardware than with a 1080p monitor.

But that's far from the case, sadly. Plenty of games still get released without those features, so it's not going to happen anytime soon.
 
Joined
Jan 12, 2023
Messages
177 (0.40/day)
System Name IZALITH (or just "Lith")
Processor AMD Ryzen 7 7800X3D (4.2Ghz base 5.0Ghz boost, -25 PBO offset)
Motherboard Gigabyte X670E Aorus Master Rev 1.0
Cooling Deepcool Gammaxx AG400 Single Tower
Memory Corsair Vengeance 64GB (2x32GB) 6000MHz CL40 DDR5 XMP (XMP enabled)
Video Card(s) PowerColor Radeon RX 7900 XTX Red Devil OC 24GB
Storage 2x1TB SSD, 2x2TB SSD, 2x 8TB HDD
Display(s) Samsung Odyssey G51C 27" QHD (1440p 165Hz) + Samsung Odyssey G3 24" FHD (1080p 165Hz)
Case Corsair 7000D Airflow Full Tower
Audio Device(s) Corsair HS55 Surround Wired Headset/LG Z407 Speaker Set
Power Supply Corsair HX1000 Platinum Modular (1000W)
Mouse Corsair RGB Harpoon PRO Wired Mouse
Keyboard Corsair K60 RGB PRO Mechanical Gaming Keyboard
Software Windows 11 Professional
I hope you all don't mind me jumping into this thread to ask about DLSS (and related technologies) and how it works. From my understanding, the way it works is:
- You set your game's rendering resolution to lower than the native for your monitor.
- You enable DLSS.
- DLSS then upscales the game on the fly to your native resolution at the driver level (using a mixture of things I don't have a hope of understanding).

The idea being that if you can't run a game at, say, 4K native with decent performance DLSS can make it look native (to its best effort, hence things like ghosting) while maintaining the performance. Have I got that right?
 
Joined
Sep 15, 2011
Messages
6,457 (1.41/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Please use PNG or RAW instead of JPG for comparison shots, if possible. The images are overly compressed. :toast:
 
Joined
Aug 9, 2019
Messages
1,512 (0.89/day)
Processor Ryzen 5600X@4.85 CO
Motherboard Gigabyte B550m S2H
Cooling BeQuiet Dark Rock Slim
Memory Patriot Viper 4400cl19 2x8@4000cl16 tight subs
Video Card(s) Asus 3060ti TUF OC
Storage WD blue 1TB nvme
Display(s) Lenovo G24-10 144Hz
Case Corsair D4000 Airflow
Power Supply EVGA GQ 650W
Software Windows 10 home 64
Benchmark Scores CB20 4710@4.7GHz Aida64 50.4ns 4.8GHz+4000cl15 tuned ram SOTTR 1080p low 263fps avg CPU game
I'm impressed! Even at 1080p, Ultra Performance is now actually usable, versus DLSS 2.4 where it was unusable. Good news for those sitting on 3050 or 2060 cards :)
 
Joined
Sep 15, 2011
Messages
6,457 (1.41/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Quite sad that so many games don't have FSR 2.0/DLSS 2.x.

If it were in every game by default, a 4K monitor would be the way to go, and you would just use whatever upscaling level you need to hit your performance target. You would get better image quality with 1080p-class hardware than with a 1080p monitor.

But that's far from the case, sadly. Plenty of games still get released without those features, so it's not going to happen anytime soon.
An image or a game on a native 1080p monitor is way sharper and clearer than 1080p upscaled on a native 4K monitor.
 
Joined
Jun 27, 2016
Messages
6 (0.00/day)
I hope you all don't mind me jumping into this thread to ask about DLSS (and related technologies) and how it works. From my understanding, the way it works is:
- You set your game's rendering resolution to lower than the native for your monitor.
- You enable DLSS.
- DLSS then upscales the game on the fly to your native resolution at the driver level (using a mixture of things I don't have a hope of understanding).

The idea being that if you can't run a game at, say, 4K native with decent performance DLSS can make it look native (to its best effort, hence things like ghosting) while maintaining the performance. Have I got that right?
Not quite. You don't set the game's rendering resolution; the DLSS preset does that for you.

DLSS presets correspond to a fixed render scale, as a percentage of your native resolution setting:

Quality = 66.7%
Balanced = 58.0%
Performance = 50.0%
Ultra Performance = 33.3%

For example, if your in-game resolution is set to 1440p, enabling DLSS Quality will have the game render internally at 960p, then intelligently upscale back to native resolution using data fed in by the game. And yes, it allows for things like 4K output at roughly the performance of 1080p (DLSS Performance at 4K renders internally at exactly 1080p).
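If it helps, here's the preset math in a few lines of Python (rounding to whole pixels is my assumption; exact per-game behaviour may differ slightly):

```python
# Render-scale factors for the DLSS 2.x presets listed above.
DLSS_SCALE = {
    "Quality": 2 / 3,            # ~66.7%
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,  # ~33.3%
}

def internal_res(width: int, height: int, preset: str) -> tuple[int, int]:
    # Internal render resolution before DLSS upscales back to native.
    s = DLSS_SCALE[preset]
    return round(width * s), round(height * s)

print(internal_res(2560, 1440, "Quality"))            # (1707, 960)  -> 960p internal
print(internal_res(3840, 2160, "Performance"))        # (1920, 1080) -> exactly 1080p
print(internal_res(3840, 2160, "Ultra Performance"))  # (1280, 720)  -> 720p internal
```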
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,726 (1.25/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RGB Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12's
Benchmark Scores 4k120 OLED Gsync bliss
I hope you all don't mind me jumping into this thread to ask about DLSS (and related technologies) and how it works. From my understanding, the way it works is:
- You set your game's rendering resolution to lower than the native for your monitor.
- You enable DLSS.
- DLSS then upscales the game on the fly to your native resolution at the driver level (using a mixture of things I don't have a hope of understanding).
Don't change the resolution to anything lower than native; DLSS handles that for you when you choose a preset.
The idea being that if you can't run a game at, say, 4K native with decent performance DLSS can make it look native (to its best effort, hence things like ghosting) while maintaining the performance. Have I got that right?
That's about the gist of it.

Use your monitor's native res and choose a DLSS setting that strikes the right balance of visual fidelity and frames per second for your preference.
 
Joined
Jan 13, 2020
Messages
25 (0.02/day)
I thought you might be trolling, so I checked out your statement. My goodness, you're not wrong.

I don't play many games anymore, so I'm a bit 'meh' about these things. But what I'm seeing called DLSS here is basically a pretty version of 1080p. It looks like what I see when my wife watches Murder, She Wrote at 720p on my 4K OLED.

That's nothing like 4K. So why not just play at 1080p instead?

View attachment 280165

View attachment 280168


Because Ultra Performance at 4K looks much better than 1080p. It has fewer jaggies, less shimmering, and smoother lines.

You only focused on the lights... but look at the jagged edges on the window and the blurry image. I'd say Ultra Performance looks better in most respects; 1080p on a 4K display looks blurry.

And Ultra Performance at 4K also delivers higher FPS than 1080p.

Personally, I would not recommend going below Performance mode, but in some rare cases Ultra Performance mode can be useful (for example, if you have an RTX 2060 and a 4K display).
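The FPS point checks out on raw pixel counts alone (a rough proxy for shading cost, ignoring the upscale overhead):

```python
# Pixels actually rendered per frame:
native_1080p = 1920 * 1080   # 2,073,600
uhd_up = 1280 * 720          # 4K at 1/3 scale (Ultra Performance) = 921,600

print(uhd_up / native_1080p)  # ~0.44 -> UP at 4K shades well under half the pixels
```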
 
Joined
Dec 25, 2020
Messages
4,444 (3.73/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1F
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Logitech G305 Lightspeed K/DA + Logitech G840 XL K/DA
Keyboard Logitech G Pro TKL K/DA
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
Because Ultra Performance at 4K looks much better than 1080p. It has fewer jaggies, less shimmering, and smoother lines.

You only focused on the lights... but look at the jagged edges on the window and the blurry image. I'd say Ultra Performance looks better in most respects; 1080p on a 4K display looks blurry.

And Ultra Performance at 4K also delivers higher FPS than 1080p.

Personally, I would not recommend going below Performance mode, but in some rare cases Ultra Performance mode can be useful (for example, if you have an RTX 2060 and a 4K display).

At 4K it targets an enhanced 720p image. I'd still say that to achieve 1080p-level fidelity you'll want to use Performance, if not Balanced mode. Ultra Performance was intended for 8K, or at least it was back when it was first announced alongside the RTX 3090.

I still remember the marketing fluff:

[attached marketing image]



Fast-forward two years, and it's barely fast enough to get Forspoken playable on ultra-high settings targeting a smooth, consistent 1080p 60 Hz without the blasted DLSS :nutkick:
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,726 (1.25/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RGB Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Fast-forward two years, and it's barely fast enough to get Forspoken playable on ultra-high settings targeting a smooth, consistent 1080p 60 Hz
Turns out that's pretty much Forspoken's fault, not DLSS's.

4K using DLSS Ultra Performance looks better than 1080p scaled to a 4K panel, not a doubt in my mind, mostly because over the last few days I've dropped the 2.5.1 DLL into every DLSS game's folder I can, and the results are fantastic.

The best way I can think to describe it: DLSS Ultra Performance mode on my 42" 4K OLED looks strikingly similar in sharpness/softness (depending on which way you slice it) to native 1080p on a 27" display, or 2560x1080 at 34/35", while providing a larger, richer image overall. It's not the pinnacle of image quality, neither are, but it certainly looks closer to 4K than it does to 720p.
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.37/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
DLSS 2.5 just destroyed that lightboard on the first Cyberpunk image. Whilst you can tell it clears up thin line stuff pretty well, it looks too aggressive.
The lightboard looks worse, but several other things look better, including the concrete below it.

DLSS can make some things look worse and other things look better than native.

I have seen that in numerous games: improvements compared to native, sharper and better textures, fewer jaggy lines, etc.
 
Joined
May 26, 2021
Messages
120 (0.12/day)
This often happens when a game is not built around DX12/Vulkan from the beginning. The Witcher ends up being CPU-bound in DX12, which is the exact opposite of what this API is supposed to do. And it is not because of ray-tracing, as even with all RT features off, the game runs much slower compared to DX11.
Precisely. Which is why I wanted to comment on that “heavy game” statement TPU made. Completely unfounded.
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,726 (1.25/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RGB Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12's
Benchmark Scores 4k120 OLED Gsync bliss
An image or a game on a native 1080p monitor is way sharper and clearer than 1080p upscaled on a native 4K monitor.
I beg to differ. Native 1080p on my 1080p monitor in absolutely no way looks better than DLSS Performance mode on my 4K monitor. Now, the monitors themselves differ, as do the upscaling methods, but there is no doubt in my mind which image is preferable.
 
Joined
Dec 12, 2012
Messages
711 (0.17/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
I beg to differ. Native 1080p on my 1080p monitor in absolutely no way looks better than DLSS Performance mode on my 4K monitor. Now, the monitors themselves differ, as do the upscaling methods, but there is no doubt in my mind which image is preferable.

I think they meant regular upscaling from 1080p to 4K. That definitely looks blurry, and it will look better on a native 1080p screen of the same size.

My friend used to have a 55" 1080p TV and it looked great from a normal viewing distance.

With my LG OLED, I actually prefer letting the TV upscale from 1080p rather than having the console or the PC GPU do it (regular upscaling). It looks sharper.

On PC you can use integer scaling for 1080p, which looks insanely sharp. Text looks a bit pixelated, but actual games look awesome, especially with TAA. When I had a 2070 Super, I played The Outer Worlds at 1080p with integer scaling, and it actually looked better than 1440p upscaled to 4K by the GPU (which still looks good on a TV).
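Integer scaling is just exact pixel duplication: at 2x, every 1080p pixel becomes a clean 2x2 block on the 4K grid, so nothing gets resampled or blurred. A minimal numpy sketch of the idea:

```python
import numpy as np

def integer_scale(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    # Nearest-neighbour integer upscale: duplicate each pixel into an
    # exact factor x factor block, so edges stay perfectly sharp.
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # dummy 1080p frame
frame_4k = integer_scale(frame_1080p, 2)
print(frame_4k.shape)  # (2160, 3840, 3)
```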

But 4K with DLSS Performance is definitely the most detailed image you can get in terms of upscaling. There are some drawbacks in certain games, but overall the technology is amazing.
 
Joined
Feb 1, 2019
Messages
2,521 (1.34/day)
Location
UK, Leicester
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 3080 RTX FE 10G
Storage 1TB 980 PRO (OS, games), 2TB SN850X (games), 2TB DC P4600 (work), 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Asus Xonar D2X
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
CD Projekt almost owned by Nvidia at this point. :)
 
Joined
Sep 15, 2011
Messages
6,457 (1.41/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
I beg to differ. Native 1080p on my 1080p monitor in absolutely no way looks better than DLSS Performance mode on my 4K monitor. Now, the monitors themselves differ, as do the upscaling methods, but there is no doubt in my mind which image is preferable.
I didn't mean DLSS, just normal upscaling done by the monitor or the GPU.
 

wolf

Performance Enthusiast
Joined
May 7, 2007
Messages
7,726 (1.25/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RGB Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12's
Benchmark Scores 4k120 OLED Gsync bliss
I didn't mean DLSS, just normal upscaling done by the monitor or the GPU.
It's been a long time since I tested that; you're likely correct. FSR or DLSS, however, are a very different story, especially given how much they can clean up from a 4K render with meh TAA.
 