
DSR?

Why are you all trying to change the desktop resolution?

I only up the resolution in-game.

Because of the refresh rate. If I up the desktop resolution, I can have 4K at 144Hz. If I try doing it in a game, I'll only get 60Hz. Which sucks balls. That's why DSR is crap.
 
Can you disable in-game V-Sync, use adaptive, and force it to your panel rez?
 
It doesn't matter; if I set it to 4K, it defaults to 60Hz, which sucks. Very few games have in-game refresh rate controls where you can adjust it, and even those often default to 60Hz once you've selected 4K, because the game assumes that resolution can only run at 60Hz. Which sucks again.

I'd use DSR a lot more, especially with older games, if this crap wasn't acting up. But at 60Hz it just feels wrong to my eyes and I can't stand it. I'll be playing a game, something feels weird, and when I check the refresh rate, it's always at 60Hz.

If AMD has this solved better, I'd go with their cards, but so far no one has managed to confirm it for me. It would be nice if someone with an RX 480 or R9 Fury and a 1080p 120/144Hz screen could verify this for me. I was all hyped for DSR, and then this disappointment...
 
Rejzor, I have to use 5K on my setup, and I have a normal 75Hz refresh rate.

I played through American McGee's Alice recently with DSR.
 
75Hz maybe, but for 144Hz it's a no-go in pretty much every single game. I've started playing Wolfenstein: The New Order, and because I couldn't enable damn anti-aliasing (not even FXAA), I tried DSR. It worked and the game looked pretty good. But again, restricted to 60Hz. The first game where I spotted this nonsense was The Witcher, the original one. Great image, stuck at a shitty 60Hz. Argh.

The thing is, the image scaling is done internally, but the graphics card thinks it's sending 4K through the output and automatically limits games to 60Hz because of it, even though in reality it's still sending out just a 1080p signal, meaning it should allow 144Hz output, not 60Hz.
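
Just to illustrate the kind of limit the card seems to be enforcing: a mode's pixel clock has to fit the display link's budget. Here's a rough Python sketch of that math. The link limits and the ~10% blanking overhead are ballpark assumptions, not exact HDMI/CVT spec values.

```python
# Ballpark sketch of the bandwidth check a driver applies to display modes.
# Link limits and blanking overhead below are rough assumptions.

LINK_LIMITS_MHZ = {
    "dual-link DVI (~330 MHz)": 330.0,
    "HDMI 1.4 (~340 MHz)": 340.0,
    "HDMI 2.0 (~600 MHz)": 600.0,
}

BLANKING = 1.1  # assume ~10% extra pixel time for blanking intervals

def needed_pixel_clock_mhz(width, height, hz):
    return width * height * hz * BLANKING / 1e6

for w, h, hz in [(1920, 1080, 144), (3840, 2160, 60), (3840, 2160, 144)]:
    clock = needed_pixel_clock_mhz(w, h, hz)
    fits = [name for name, limit in LINK_LIMITS_MHZ.items() if clock <= limit]
    print(f"{w}x{h} @ {hz} Hz needs ~{clock:.0f} MHz -> fits: {', '.join(fits) or 'none of these'}")
```

4K at 144Hz blows past all of those limits, which is presumably why the driver clamps the mode to 60Hz. The absurdity with DSR is that the wire is still only carrying 1080p, so the clamp shouldn't apply at all.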
 
Wolfenstein is known to be hard-locked to 60 FPS, case closed.
 
Wolfenstein is known to be hard-locked to 60 FPS, case closed.

Framerate has NOTHING to do with refresh rate. You can still run a 60 FPS-locked game at 144Hz...
 
Framerate has NOTHING to do with refresh rate. You can still run a 60 FPS-locked game at 144Hz...

Uh, you can, but honestly what you just said was quite ignorant and nearly sig-worthy...

It'll tear like shit because it's running at 60 FPS while the screen redraws 144 times per second (that's what "Hz" means), which is not a multiple of the framerate.

You basically just described turning off V-Sync. Hello, pointless screen tearing for no gain. You may as well run it at 60Hz and call it a day.
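
To put toy numbers on that (a Python sketch with idealized timing and integer ticks so the math stays exact; real swap-chain behavior is messier):

```python
# Toy model: 60 unsynced frame flips per second against a 144 Hz scan-out.
# Timebase: 1 tick = 1/(60*144) s, so one refresh lasts FPS ticks and one
# frame lasts HZ ticks. A flip landing mid-scan produces a tear line.

FPS, HZ = 60, 144
REFRESH_TICKS, FRAME_TICKS = FPS, HZ

torn = []
for r in range(HZ):                                    # one second of scan-outs
    start = r * REFRESH_TICKS
    end = start + REFRESH_TICKS
    if start // FRAME_TICKS != (end - 1) // FRAME_TICKS:     # flip mid-scan
        flip = (start // FRAME_TICKS + 1) * FRAME_TICKS
        torn.append((flip - start) / REFRESH_TICKS)          # 0 = top, 1 = bottom

print(f"{len(torn)} of {HZ} refreshes per second are torn")
print("tear line positions:", [f"{p:.0%}" for p in torn[:4]], "... then the cycle repeats")
```

In this model 48 of every 144 scan-outs carry a tear line, and it jumps around the screen (40%, 80%, 20%, 60% of the height) precisely because 144 isn't a multiple of 60.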
 
Um lol, no. You get tearing when the framerate goes ABOVE the refresh rate. That's why you need V-Sync, which limits the framerate to the refresh rate. It's why Adaptive V-Sync auto-disables itself when the framerate drops below the refresh rate: because there's no real need to use it there.

There is some micro-tearing if there are flickering lights in a game, but other than that, no noticeable tearing.
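
In pseudo-form, the policy being described amounts to something like this (an assumed model of the behavior, not NVIDIA's actual driver code):

```python
# Assumed Adaptive V-Sync policy, per the description above: hold flips for
# vblank only while the game keeps up with the monitor; below refresh, flip
# immediately so a slow frame doesn't stall until the next vblank.

def hold_flip_for_vblank(frame_time_s: float, refresh_hz: float) -> bool:
    return frame_time_s <= 1.0 / refresh_hz

for ft in (1 / 200, 1 / 144, 1 / 90):   # fast, borderline, slow frames at 144 Hz
    mode = "sync (V-Sync on)" if hold_flip_for_vblank(ft, 144.0) else "flip now (V-Sync off)"
    print(f"{1 / ft:.0f} fps -> {mode}")
```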
 
Um lol, no. You get tearing when the framerate goes ABOVE the refresh rate. That's why you need V-Sync, which limits the framerate to the refresh rate. It's why Adaptive V-Sync auto-disables itself when the framerate drops below the refresh rate: because there's no real need to use it there.

There is some micro-tearing if there are flickering lights in a game, but other than that, no noticeable tearing.

Don't forget just capping your FPS. I run 60Hz, V-Sync off, capped at 59 FPS. No V-Sync issues, no input lag, no tearing :D
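
For reference, the bare-bones version of that kind of cap is just a sleep at the end of each frame (a Python sketch; real limiters use a high-resolution timer plus a short busy-wait for accuracy):

```python
import time

CAP_FPS = 59.0
FRAME_BUDGET = 1.0 / CAP_FPS          # seconds each frame is allowed to take

def run_capped(render_frame, seconds=3.0):
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()                # game's update + draw
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)      # don't start the next frame early

run_capped(lambda: None)              # dummy "game" that renders instantly
```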
 
Not always. If you try that in CS:GO, you'll get horrendous image tearing. Far worse than just leaving it uncapped.
 
Um lol, no. You get tearing when the framerate goes ABOVE the refresh rate.
You get tearing either above or below refresh, not just above. The only way to stop it is to use V-Sync.

It does tend to be less noticeable when it's under, though, which is probably why you think this.
 
It's not what I believe, it's how it works. Why do you think NVIDIA designed Adaptive V-Sync so that it turns off when the framerate drops below the refresh rate? If it were useless, why bother, right? Like I've said, tearing is worst when the framerate is higher than the refresh rate, because then you're skipping frames between refresh syncs and it tears like mad. If the refresh rate is higher, you're just re-scanning the same identical frame twice. It may slightly misalign, but far less than when there's a skip and the screen tries to stitch two very different frames together. That creates the tear. Very simplified, but that's why tearing happens.
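
That argument is easy to sanity-check with a toy model (same idealized-timing caveat as before; integer ticks keep the math exact):

```python
# Count scan-outs per second that splice two different frames together,
# for a framerate below vs above a 60 Hz refresh.

def spliced_per_second(fps: int, hz: int) -> int:
    # 1 tick = 1/(fps*hz) s: a refresh lasts `fps` ticks, a frame `hz` ticks
    spliced = 0
    for r in range(hz):
        start = r * fps
        end = start + fps
        if start // hz != (end - 1) // hz:   # frame flip lands mid-scan
            spliced += 1
    return spliced

for fps in (45, 90):
    print(f"{fps} fps on 60 Hz: {spliced_per_second(fps, 60)}/60 scan-outs splice two frames")
```

In this toy model, below refresh only half the scan-outs splice frames (the rest are clean repeats of a single frame), while above refresh every single scan-out stitches two different frames together.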
 
It's not what I believe, it's how it works. Why do you think NVIDIA designed Adaptive V-Sync so that it turns off when the framerate drops below the refresh rate?

Yes, exactly, this is how it works, and should work, to kill latency.
 
It's not what I believe, it's how it works. Why do you think NVIDIA designed Adaptive V-Sync so that it turns off when the framerate drops below the refresh rate? If it were useless, why bother, right? Like I've said, tearing is worst when the framerate is higher than the refresh rate, because then you're skipping frames between refresh syncs and it tears like mad. If the refresh rate is higher, you're just re-scanning the same identical frame twice. It may slightly misalign, but far less than when there's a skip and the screen tries to stitch two very different frames together. That creates the tear. Very simplified, but that's why tearing happens.
Not sure if you're trying to negate what I said or not, but you haven't. This is simply a more detailed description, and I said myself that it's generally more noticeable when the graphics card is rendering faster than the refresh.

I do remember playing UT2004 years ago with a low-end card that couldn't maintain 60fps, and that tore something awful. Mix that with the judder and it was truly terrible. It was an FX5200 or GF6200, I forget which now.

BTW, I've also noticed that if the fps is really high, like 250-350fps, syncing barely matters and the animation generally looks smooth, with the edges of moving objects pretty much holding together. Having the monitor refresh really high, like 144Hz, helps further. The tearing still happens, but it must all be too quick and fleeting for the eye to see most of the time, I guess.
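
The toy model agrees with that impression: at very high framerates each scan-out gets sliced into several bands, but neighbouring bands come from frames rendered only a few milliseconds apart, so there's barely any content difference to see. A quick sketch (same idealized assumptions):

```python
# Average number of tear lines per 144 Hz scan-out at very high framerates,
# plus how far apart in time the content on either side of each tear is.

def tear_lines_per_scanout(fps: int, hz: int) -> float:
    flips = 0
    for r in range(hz):
        start, end = r * fps, (r + 1) * fps     # refresh window in ticks
        flips += (end - 1) // hz - start // hz  # flips landing mid-scan
    return flips / hz

for fps in (250, 300, 350):
    print(f"{fps} fps on 144 Hz: ~{tear_lines_per_scanout(fps, 144):.1f} tear lines "
          f"per scan-out, content either side only {1000 / fps:.1f} ms apart")
```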
 
I've only experienced horrible image tearing, like the image was being shredded, in BioShock 2 with MLAA enabled when it first came out on Radeons. But that was just a bug that soon got fixed.
 