
DSR?

Can someone help me with my understanding of DSR? I read the article on NVIDIA's website, but it still doesn't make much sense to me. Does your monitor need to support it, or will it work regardless? That might be a deciding factor in what I end up getting for a screen (or whether I scrap what I already have). It's definitely a conundrum for me.
 
It's like how supersampling AA used to work: you need a lot of GPU power for a small gain in image quality.
 
It's a useless feature if you have a 120 or 144 Hz monitor. It forces 99% of games to run at 60 Hz even though you're physically capable of running them at a 144 Hz refresh, which makes the gaming experience rubbish and makes owning a speedy 144 Hz screen pointless. I've used it a few times, and while it produces pleasant image quality, especially in games where you can't enable edge smoothing any other way (Wolfenstein: The New Order, for example), the refresh-rate thing ruins it entirely. I can't force myself to play games at 60 Hz; it practically makes my eyes bleed.
 
It gives you a 'pretend' resolution higher than your monitor really supports.

Most of the time it slightly improves quality in games but badly hurts small-text quality. Only really handy for benchmarking, IMO.
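For anyone wondering what those 'pretend' resolutions actually work out to, here's a rough Python sketch. It assumes a 1920x1080 native panel and NVIDIA's usual DSR factors; the factors scale the total pixel count, so each axis scales by the square root of the factor. The driver rounds to tidier values (1.78x shows up as 2560x1440, for instance), so treat the outputs as approximate.

[CODE]
import math

NATIVE_W, NATIVE_H = 1920, 1080  # assumed 1080p native panel

# DSR factors multiply the total pixel count, so each axis
# scales by sqrt(factor); the driver rounds to standard resolutions.
for factor in (1.20, 1.50, 1.78, 2.00, 2.25, 3.00, 4.00):
    scale = math.sqrt(factor)
    w, h = round(NATIVE_W * scale), round(NATIVE_H * scale)
    print(f"{factor:.2f}x DSR -> render at ~{w}x{h}, downscale to {NATIVE_W}x{NATIVE_H}")
[/CODE]

At 4.00x that works out to 3840x2160, which is where the benchmarking angle comes from.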
 
I don't understand why supersampling isn't a thing these days. It works in ANY game regardless of the rendering technique used, it doesn't make text and the GUI tiny, and we have enough horsepower to use it, if nothing else at least for older games. And yet we're stuck with only FXAA, which still doesn't work everywhere (OpenGL, anyone?), and MSAA, which hardly works anywhere these days.

I wonder if AMD's VSR has the same refresh-rate problems as NVIDIA's DSR when you have a 144 Hz monitor...
 
People, I think it's pretty useful as a test function. On a 1080p display, the graphics card internally renders at 4K but has to downsample the image to 1080p. This gives you what looks like a 4K desktop, but with none of the clarity. It can also function as a sort of smoothing for games, but I thought that regular AA both looked better and performed better.

No, its main benefit is checking how well your graphics card does at 4K in any particular game, as performance drops considerably with four times the number of pixels to render. This can help you decide whether you want to get a better card, along with a 4K display, without actually having to buy that expensive monitor first.
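To put a number on the "four times as many pixels" point, here's a back-of-envelope sketch. The 120 fps figure is made up, and dividing by the pixel ratio is only a crude assumption that holds if the game is completely GPU-bound.

[CODE]
# Pixel counts: 4K is exactly four times 1080p
native_pixels = 1920 * 1080       # 2,073,600
dsr_4k_pixels = 3840 * 2160       # 8,294,400
ratio = dsr_4k_pixels / native_pixels
print(ratio)                      # 4.0

# Hypothetical 1080p frame rate scaled by the pixel ratio --
# a rough upper bound on what the same card would do at 4K
fps_1080p = 120
print(fps_1080p / ratio)          # ~30 fps at the 4K DSR resolution
[/CODE]

Real games rarely scale that cleanly (CPU limits, memory bandwidth, etc.), but it shows why DSR is a cheap way to preview 4K performance before buying the monitor.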
 
regular AA both looked better and performed better.

Not always. In GTA V, for example, the distant shadows and shimmering are way less noticeable using DSR. Also, some moving objects like windmills and metal mesh structures look normal in MGSV with DSR. It also helps when playing old Unreal-engine-based games, like Mass Effect, that don't have AA as such.

I've never had an FPS limit using DSR. Something strange there.

Adjust the sharpness in the settings; I prefer using less, around 15.
 
I'm sure one or the other could look better depending on the game and quality settings. I just did a little casual check on a handful of games and noticed that AA looked better, that's all.

I'm not sure what you mean by an FPS limit? I'm just saying the performance dropped because the card had 4 times as many pixels to push around.

I found that the default 33% sharpness setting looked best to me when showing the desktop.
 
Does your monitor need to support it, or will it work regardless?

No. DSR is hardware-agnostic when it comes to monitors.

When you enable DSR for a particular game the NVIDIA driver essentially "tricks" the game into believing that your monitor supports higher resolutions (the DSR resolutions you allow). When you run the game, you have to go into its options and select one of these DSR resolutions for DSR to actually have any effect.

Concrete example: you have a 1920x1080 monitor, enable a 3840x2160 (4K) DSR resolution for one of your games, and set that game to use that resolution. When you launch that game, the graphics card will render the frames at 3840x2160, then shrink them back down to 1920x1080 when it sends them to the monitor. This is similar to, but NOT the same as, how supersampled anti-aliasing (SSAA) works. For more info I'd recommend reading this article: http://techreport.com/review/27102/maxwell-dynamic-super-resolution-explored

Basically, DSR may or may not be faster than SSAA, and may or may not look better. However, the fact that the NVIDIA control panel allows you to specify the smoothness of DSR gives you more control over how the final image looks than you would have with SSAA. Personally I've found that running UT3 at 3840x2160 on my 2560x1440 monitor performs and looks better than any antialiasing options I've tried.
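If it helps to see the "render big, shrink back down" step as code, here's a minimal NumPy sketch. It uses a plain 2x2 box average, which is NOT NVIDIA's actual filter; DSR reportedly uses a 13-tap Gaussian, and that's what the smoothness slider in the control panel tunes.

[CODE]
import numpy as np

def box_downscale_2x(frame: np.ndarray) -> np.ndarray:
    """Average every 2x2 block of pixels into one output pixel."""
    h, w, c = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# Hypothetical 3840x2160 RGB frame the game rendered under DSR...
hi_res = np.random.rand(2160, 3840, 3).astype(np.float32)

# ...shrunk back down to the 1920x1080 signal the monitor actually receives
lo_res = box_downscale_2x(hi_res)
print(lo_res.shape)  # (1080, 1920, 3)
[/CODE]

That per-pixel averaging is also why DSR resembles SSAA: each output pixel ends up blending several rendered samples.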
 
The FPS limit comment referred to RejZoR's issue.

I'm at work; I have screenshots of The Witcher and MGSV at home from a test to see whether it really is better. The mesh structures in MGSV gained the most fidelity.

If the thread doesn't die, I might post some game pr0n.
 
Please do, very interested to see it in action ...

Here are some quick ones.

(attached screenshots: 20160812233456_1.jpg, 20160812233527_1.jpg)
 
It's like how supersampling AA used to work: you need a lot of GPU power for a small gain in image quality.

Also higher power usage, compared to using AA.

EDIT: And it can give you a good idea of whether you could run a higher-resolution monitor with your games too.
 
It gives you a 'pretend' resolution higher than your monitor really supports.

Most of the time it slightly improves quality in games but badly hurts small-text quality. Only really handy for benchmarking, IMO.
Why would this help in benchmarking? The point is to get the most FPS, not make it look pretty. ;)
 
Why would this help in benchmarking? The point is to get the most FPS, not make it look pretty. ;)
Bench higher resolutions.
 
No. DSR is hardware-agnostic when it comes to monitors.
What a great answer. :) Why don't you try 4x DSR on your 2560x1440 monitor and let us know how it looks and performs? That's a 5120x2880 frame buffer. Unfortunately I've only got a 1080p monitor, so I can't do this.

I'd love to try out DSR on a 4K monitor. I don't know if current-gen cards can even support that: 7680x4320. Just imagine how tiny all the desktop icons would look!
 
AMD's version, aka VSR, is FAR better IMO. The only thing I changed was the video card, from a 7870 GHz Edition (or 7970 GHz Edition) to a GTX 970, and I tried both brands' versions; the AMD one was WAYYYY better. That's my experience. I can't say whether it's the better interface of the AMD settings, the performance, or the implementation, but the end result was better with AMD for me.
 
Why would this help in benchmarking? The point is to get the most FPS, not make it look pretty. ;)

to benchmark at a higher res than your monitor can do. Like many have already said, testing 2K/4K performance from a 1080p screen.
 
Thanks bruhs!
 
DSR makes games look like utter crap most of the time in my experience.
 
That's interesting, and I'd like to compare for myself and play around with it, but unfortunately I only have very old 2000-series AMD cards, which don't support it. Is it just clearer, perhaps? If so, that would point to a better downsampling algorithm.
 
The best one is still GeDoSaTo... anyone can try it on DX9 titles.
 

Do you have any info regarding refresh rate with VSR? Does it always support the screen's actual max native refresh (i.e. 120 or 144 Hz) regardless of the processing resolution, or is it limited to the refresh rate of the resolution it's processing the image at? In the case of 4K, that would be only 60 Hz.

The stupid thing with DSR is that if I force the desktop to 4K at 144 Hz, games will also run at that, no problem. But if I let DSR kick in via the game, it sticks to a stupid 60 Hz. I talked about it on the NVIDIA forums ages ago, received zero replies, and nothing changed. Lame as hell. I'm not in the mood to force the desktop resolution to 4K 144 Hz every time I want to play a game.
 
Why are you all trying to change the desktop resolution?

I only up the resolution in-game.
 
That's interesting, and I'd like to compare for myself and play around with it, but unfortunately I only have very old 2000-series AMD cards, which don't support it. Is it just clearer, perhaps? If so, that would point to a better downsampling algorithm.

The interface was better on AMD's side IMO, and the image was as well.
Again, I can't say why one would be better.

@RejZoR

Sorry, my monitor is a shitty old 1200p @ 60 Hz :(

But I have heard of the refresh issues before; it's too bad.
 