
FreeSync / G-Sync flicker

Hi,

I bought a new monitor not long ago. It's a Dell S3422DWG 34" 3440x1440 144 Hz VA ultrawide.

It's an absolutely amazing monitor, but I've noticed that I occasionally get some weird backlight flicker in some games. After looking into it, I found this:


Basically, if you have a Variable Refresh Rate display with Low Framerate Compensation (LFC), and your in-game framerate comes close to, or drops below, the bottom of the display's VRR range (48 Hz in my case), then LFC kicks in and doubles or triples your refresh rate. So, if you have 50 or 55 FPS, for example, the monitor will switch to 100 or 110 Hz momentarily to stay within VRR range and avoid screen tearing. This comes with backlight flicker on some displays (mine included), which is extremely annoying.
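To put numbers on it, here's a minimal Python sketch of the frame-multiplication arithmetic as I understand it. This is purely illustrative, not Dell's or AMD's actual logic; in particular, the exact point where the driver hands over to LFC is undocumented, so the `engage_below` threshold here is a guess:

```python
def lfc_refresh(fps: float, vrr_min: float = 48.0, vrr_max: float = 144.0,
                engage_below: float = 58.0) -> float:
    """Refresh rate a VRR panel might run at for a given frame rate.

    Inside the VRR window the panel simply matches the frame rate.
    Near or below the floor, LFC repeats each frame 2x, 3x, ... until
    the resulting refresh lands back inside the window. 'engage_below'
    is a guessed margin: real drivers engage LFC somewhere above the
    floor (undocumented) to avoid oscillating right at the boundary.
    """
    if fps >= engage_below:
        return min(fps, vrr_max)  # plain VRR: refresh tracks frame rate
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier  # e.g. 50 FPS -> 100 Hz, 55 FPS -> 110 Hz


for fps in (60, 55, 50, 30, 20):
    print(f"{fps} FPS -> panel refreshes at {lfc_refresh(fps):.0f} Hz")
```

The annoying part seems to be the hand-off: near the engage point the panel can bounce between the two regimes (55 Hz one moment, 110 Hz the next), and on some panels the backlight visibly reacts to that jump.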

The question is, is it possible to disable LFC without disabling FreeSync somehow? I generally have no issues playing at 48 FPS, but the flickering makes it really bad. :(

Another question is, what else can I do to mitigate backlight flicker with LFC?

I would like to avoid using CRU and any kind of custom things that aren't part of my display's official support if possible.
 
I've got a 1440p Dell 165 Hz G-Sync monitor, and I don't have this issue.

Hope you can fix it somehow without buying a faster GPU...
 
I see FreeSync as a solution in need of a problem, because I haven't seen chop in games in decades with V-sync off. I think Nvidia might have already implemented some type of frame delivery into their drivers to stop it from happening before FreeSync even came out. That, and altering the monitor's refresh rate on the fly alters mouse sensitivity on the fly, making it useless. All new technologies like this seem like they're designed by people who don't even play video games.

If it actually provided some type of benefit with no drawback, I'd leave it on, but cursor movement is less direct and more floaty for me while bringing little to no smoothness benefit. DisplayPort is also a little smoother than HDMI for me (I think HDMI is more on-the-fly transmit, while DisplayPort is more bundled packetization), but I still use HDMI because it controls better, while DisplayPort sensitivity feels more dead.
 
I've got a 1440p Dell 165 Hz G-Sync monitor, and I don't have this issue.

Hope you can fix it somehow without buying a faster GPU...
Thanks. :)

A faster GPU is an endless money pit because there's always a new game that needs a faster GPU. :(

I'd be happy to play with any frame rate within my FreeSync range (48-144 Hz/FPS) if I didn't have this flicker.

One solution is disabling FreeSync, but I'd like to avoid it if possible.

I see FreeSync as a solution in need of a problem, because I haven't seen chop in games in decades with V-sync off. I think Nvidia might have already implemented some type of frame delivery into their drivers to stop it from happening before FreeSync even came out. That, and altering the monitor's refresh rate on the fly alters mouse sensitivity on the fly, making it useless. All new technologies like this seem like they're designed by people who don't even play video games.

If it actually provided some type of benefit with no drawback, I'd leave it on, but cursor movement is less direct and more floaty for me while bringing little to no smoothness benefit. DisplayPort is also a little smoother than HDMI for me (I think HDMI is more on-the-fly transmit, while DisplayPort is more bundled packetization), but I still use HDMI because it controls better, while DisplayPort sensitivity feels more dead.
Interesting take on the matter.

I think FreeSync is great, because it avoids screen tearing within its range, as well as the input lag that you get with traditional V-sync.

Disabling it and playing with a constant 144 Hz is certainly a solution, just not the solution I'd prefer.
 
A faster GPU is an endless money pit because there's always a new game that needs a faster GPU. :(

I fell for this meme and tried the resource hog game Returnal. First thoughts were, well, the gunplay is azz and this is clearly a console trash game, but maybe the story will make up for it. That part didn't work out either. I'm not sure if there are actually any good resource hog games that exist. Same phenomenon with Dying Light: players seem to like 1 better than 2. Can't think of a promising 'next gen graphics' title on the horizon besides maybe Stalker 2.

Haven't tried Starfield yet, but I've never liked any previous Bethesda games. Games like Oblivion and Skyrim were just big, empty worlds to me with meh story compared to a real RPG and mediocre combat. I guess that leaves only Cybermeme2077 as a potential good 'next gen graphics' game.
 
I fell for this meme and tried the resource hog game Returnal. First thoughts were, well, the gunplay is azz and this is clearly a console trash game, but maybe the story will make up for it. That part didn't work out either. I'm not sure if there are actually any good resource hog games that exist. Same phenomenon with Dying Light: players seem to like 1 better than 2. Can't think of a promising 'next gen graphics' title on the horizon besides maybe Stalker 2.
Sure, but you can encounter resource-heavier areas in older games, too. I'm currently playing Kingdom Come: Deliverance, which is a 2017 title (I think), and while it plays well 98% of the time, there are a few population-dense areas in the game that can bring my framerate down to the low 60s - high 50s, which is more than fine for me, but it triggers the aforementioned LFC flickering, which is not fine.
 
Sure, but you can encounter resource-heavier areas in older games, too. I'm currently playing Kingdom Come: Deliverance, which is a 2017 title (I think), and while it plays well 98% of the time, there are a few population-dense areas in the game that can bring my framerate down to the low 60s - high 50s, which is more than fine for me, but it triggers the aforementioned LFC flickering, which is not fine.
Ah, CryEngine. I used to be a big CryEngine fan and wondered why nobody used it anymore. I went back to the Crysis series recently and figured out why. The gunplay in UE4 games seems to have beaten it, and all the new post-processing, lag-fest features that they started to implement in Crysis seem to weigh down UE4 less than CryEngine (input lag-wise).

By the time you get to Crysis 3, even though my GPU is vastly better than what's needed, I'm forced to play the game on low settings, because only on low does it turn off 10+ post-processing effects to make the cursor not feel like you're dragging around a boat anchor. The game is literally unplayable on high, even with a 2023 GPU, if you value responsive controls. Crysis 2 didn't suffer from this phenomenon nearly as badly, though.
 
Ah, CryEngine. I used to be a big CryEngine fan and wondered why nobody used it anymore. I went back to the Crysis series recently and figured out why. The gunplay in UE4 games seems to have beaten it, and all the new post-processing, lag-fest features that they started to implement in Crysis seem to weigh down UE4 less than CryEngine (input lag-wise).

By the time you get to Crysis 3, even though my GPU is vastly better than what's needed, I'm forced to play the game on low settings, because only on low does it turn off 10+ post-processing effects to make the cursor not feel like you're dragging around a boat anchor. The game is literally unplayable on high, even with a 2023 GPU, if you value responsive controls. Crysis 2 didn't suffer from this phenomenon nearly as badly, though.
I don't disagree, but it's not a solution to the problem, unfortunately.

Like I said, the game runs fine, but my monitor goes into some weird flicker at the lower range of the FreeSync spectrum. The problem is not the low frame rate. The problem is the display flickering.
 
I've got a 1440p Dell 165 Hz G-Sync monitor, and I don't have this issue.

Hope you can fix it somehow without buying a faster GPU...
It's this specific display that has this issue, which was mentioned in multiple reviews out there.

If you're within the return window, just return it and save yourself the trouble.
 
It's this specific display that has this issue, which was mentioned in multiple reviews out there.

If you're within the return window, just return it and save yourself the trouble.
I'd like to find a solution to keep it if possible. The flicker is only an issue in some rare instances, otherwise, this is a brilliant monitor.
 
I'd like to find a solution to keep it if possible. The flicker is only an issue in some rare instances, otherwise, this is a brilliant monitor.
At least you know if all else fails you can just get a larger GPU. I don't find sub-60 FPS gaming enjoyable myself, so I'd probably already be considering that option anyway if I were you. I see Founders Edition 3080s on eBay for $400, and pristine 3070 Tis that weren't used for mining for $350.
 
I don't find sub-60 FPS gaming enjoyable myself
But I do, that's the thing. Anything above 30-45 FPS (depending on the game) looks acceptable to me. If I'm within FreeSync range (48-144 FPS), I'm good. I don't need LFC to kick in at 50.
 
Kingdom Come plays kind of clunky at all refresh rates, so no big loss there. Wait till you load up something like RE4. It's one of those games that plays horrendously at low framerates, and you'll probably be rethinking that stance.
 
Can CRU help? My Acer 27" 144 Hz needs its FreeSync range adjusted from the default to, say, 38-140 Hz to make the flickering go away. It's a common issue with my LCD, so it may apply to your Dell with some Google-fu.
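For what it's worth, here's a rough sketch of why lowering the floor can help. The arithmetic is illustrative only, not AMD's actual logic, and the 1.2x engage margin is an assumption rather than a documented driver value:

```python
# Lowering the reported VRR floor (which is what CRU edits) pushes the
# point where LFC starts frame-doubling below the frame rates the game
# actually hits, so the panel stays in plain VRR and doesn't flicker.

def lfc_engage_point(vrr_min: float, margin: float = 1.2) -> float:
    """Rough FPS below which the driver would start frame-doubling."""
    return vrr_min * margin

for vrr_min, vrr_max in ((48, 144), (38, 140)):
    # LFC needs the range to span at least a 2:1 ratio; both do here.
    assert vrr_max / vrr_min >= 2, "LFC needs max/min >= 2"
    print(f"range {vrr_min}-{vrr_max} Hz -> LFC engages below "
          f"~{lfc_engage_point(vrr_min):.0f} FPS")
```

With a 48 Hz floor, frame rates in the low-to-mid 50s sit right at the hand-off; with a 38 Hz floor, the same frame rates are comfortably inside the plain VRR window.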

How to eliminate flickering on G-Sync / FreeSync LCDs:
 
Possibly. I'll have a go at it a bit later. :) Thanks.

@FoulOnWhite - How about your Dell? Have you experienced anything similar?

I have had no flicker that I can see with my display. I set Windows to native at 144 Hz, and usually have games the same with V-sync on. I have not noticed any flicker in games, even running CP2077 with RT on (it ran at around 60 FPS, lol), and did not notice any flickering. Why would I lie? If I had, I would be complaining or trying to figure out how to fix it.

What's the default range on this display? Any reason to change it?
 
I have had no flicker that I can see with my display. I set Windows to native at 144 Hz, and usually have games the same with V-sync on. I have not noticed any flicker in games, even running CP2077 with RT on (it ran at around 60 FPS, lol), and did not notice any flickering. Why would I lie? If I had, I would be complaining or trying to figure out how to fix it.

What's the default range on this display? Any reason to change it?
Hm... maybe because you have V-sync on? Do you use FreeSync?

The range reported by the display is 48-144 Hz. There is zero flicker on the Windows desktop, or when the game runs above 50-ish FPS, which it mostly does. I've also just tried W40k Boltgun which runs at a constant 144 FPS with a limiter (obviously, as it's a pixel art game), no flicker there, either. I only see it in some very isolated cases. That's why I don't want to RMA it. It's an awesome monitor overall.
 
Hm... maybe because you have V-sync on? Do you use FreeSync?

The range reported by the display is 48-144 Hz. There is zero flicker on the Windows desktop, or when the game runs above 50-ish FPS, which it mostly does. I've also just tried W40k Boltgun which runs at a constant 144 FPS with a limiter (obviously, as it's a pixel art game), no flicker there, either. I only see it in some very isolated cases. That's why I don't want to RMA it. It's an awesome monitor overall.

My GPU is Nvidia, so I don't think FreeSync works? I like this monitor too; imo you can't go wrong with a Dell display. Also, I haven't really noticed any VA panel smear either. Do I need to look for it to notice it?

I got mine for £275, so I'm not gonna get a better display for near that price, am I? My previous was a really nice Dell 27" 1440p 165 Hz Nano LG panel, but I wanted to go UW.

Old monitor:
https://www.amazon.co.uk/Dell-S2721DGFA-Compatible-DisplayPort-Adjustable/dp/B08NFBKNMY
 
I've also just tried W40k Boltgun
Another Boltgun victim. It's one of those games that seems cool for the first 5 minutes, then you notice it's almost identical to playing DOS Doom in terms of map design, weapons, gameplay, etc. Doom is actually probably better. Stuff like Roboquest, Deadlink, and even Unity Gunfire Reborn are all way better, not to mention things like the Borderlands games.
 
I bought a new monitor not long ago. It's a Dell S3422DWG 34" 3440x1440 144 Hz VA ultrawide.
I had flickering issues and black screens, so the first thing I tried was replacing my cheap DisplayPort 1.4 cable with a 2.1, and all of the issues stopped. Also, your Dell only supports FreeSync on the DisplayPort, by the looks of the specs.

From the Dell page:
Native Resolution
WQHD 3440 x 1440 (DisplayPort: 144 Hz, HDMI: 100 Hz)

Link: Dell 34 Inch WQHD Curved LCD Gaming Monitor - S3422DWG | Dell USA

If it actually provided some type of benefit with no drawback, I'd leave it on, but cursor movement is less direct and more floaty for me while bringing little to no smoothness benefit. DisplayPort is also a little smoother than HDMI for me (I think HDMI is more on-the-fly transmit, while DisplayPort is more bundled packetization), but I still use HDMI because it controls better, while DisplayPort sensitivity feels more dead.
You are pretty much correct: HDMI uses a fixed bit rate for the display resolution you are set to, while DisplayPort uses a variable bit rate for the display resolution you are set to. This is why Nvidia and AMD recommend that you use DisplayPort with G-Sync and FreeSync.
 
My GPU is Nvidia, so I don't think FreeSync works? I like this monitor too; imo you can't go wrong with a Dell display. Also, I haven't really noticed any VA panel smear either. Do I need to look for it to notice it?
Nvidia supports it on some displays, I'm not sure if this one is on the list. If not, and you're running it at constant 144 Hz, that explains it. :)

I haven't noticed any VA smear, either. The only other thing I noticed is a little bit of backlight bleed at high brightness and a full black image, but it becomes unnoticeable as soon as there is something else in the picture, so I don't care. It's a beautiful display with beautiful colours, and I love that it's not super-duper gamery in design. :)

Another Boltgun victim. It's one of those games that seems cool for the first 5 minutes, then you notice it's almost identical to playing DOS Doom in terms of map design, weapons, gameplay, etc. Doom is actually probably better. Stuff like Roboquest, Deadlink, and even Unity Gunfire Reborn are all way better, not to mention things like the Borderlands games.
I've only finished the first level so far. Yep, Doom 2 with a W40k skin and ultrawide support through and through (I'm not saying that it's bad, though - sometimes I need the simplicity).

I had flickering issues and black screens, so the first thing I tried was replacing my cheap DisplayPort 1.4 cable with a 2.1, and all of the issues stopped.
Cheap HDMI cables gave me headaches with my 4K TV before, but I didn't think cable quality could be an issue here. Thanks for the tip, I'll try a new cable. :)

Edit: The 2.1 cables I found seem to be no-name and very cheap, but do you think this could work while only being 1.4?
 
Nvidia supports it on some displays, I'm not sure if this one is on the list. If not, and you're running it at constant 144 Hz, that explains it. :)

I haven't noticed any VA smear, either. The only other thing I noticed is a little bit of backlight bleed at high brightness and a full black image, but it becomes unnoticeable as soon as there is something else in the picture, so I don't care. It's a beautiful display with beautiful colours, and I love that it's not super-duper gamery in design. :)


I've only finished the first level so far. Yep, Doom 2 with a W40k skin and ultrawide support through and through (I'm not saying that it's bad, though - sometimes I need the simplicity).


Cheap HDMI cables gave me headaches with my 4K TV before, but I didn't think cable quality could be an issue here. Thanks for the tip, I'll try a new cable. :)

Edit: The 2.1 cables I found seem to be no-name and very cheap, but do you think this could work while only being 1.4?

I believe I am using the DP cable that came with the monitor.

This cable:
https://www.amazon.co.uk/Hotron-Display-Port-Cable-E246588/dp/B077MKDFG4

Probably a crappy cable; maybe I should try a better one anyway.
 
Hi,

I bought a new monitor not long ago. It's a Dell S3422DWG 34" 3440x1440 144 Hz VA ultrawide.

It's an absolutely amazing monitor, but I've noticed that I occasionally get some weird backlight flicker in some games. After looking into it, I found this:


Basically, if you have a Variable Refresh Rate display with Low Framerate Compensation (LFC), and your in-game framerate comes close to, or drops below, the bottom of the display's VRR range (48 Hz in my case), then LFC kicks in and doubles or triples your refresh rate. So, if you have 50 or 55 FPS, for example, the monitor will switch to 100 or 110 Hz momentarily to stay within VRR range and avoid screen tearing. This comes with backlight flicker on some displays (mine included), which is extremely annoying.

The question is, is it possible to disable LFC without disabling FreeSync somehow? I generally have no issues playing at 48 FPS, but the flickering makes it really bad. :(

Another question is, what else can I do to mitigate backlight flicker with LFC?

I would like to avoid using CRU and any kind of custom things that aren't part of my display's official support if possible.
Not really a whole lot you can do without increasing the VRR range with CRU, that I'm aware of. The other option would obviously be to adjust your graphics settings or use upscalers to keep the FPS above 48. This is an inherent issue on some VRR monitors, especially with VA panels. A better cable is not going to resolve this problem.
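If you want to see how close a given game actually sits to that threshold, logging frame times and flagging the dips works. A minimal sketch, assuming a 48 Hz floor; the helper and the sample numbers are made up for illustration, not from any tool mentioned in this thread:

```python
# Scan a frame-time log (milliseconds per frame) and flag the frames
# that fall below the VRR floor, i.e. where LFC and the flicker would
# kick in. Tune settings or an upscaler until nothing gets flagged.

VRR_FLOOR_HZ = 48.0
FLOOR_MS = 1000.0 / VRR_FLOOR_HZ  # ~20.8 ms per frame at 48 FPS

frame_times_ms = [14.1, 15.0, 16.9, 22.4, 21.7, 15.2, 13.9]  # made-up sample

for i, ft in enumerate(frame_times_ms):
    fps = 1000.0 / ft
    status = "below the floor, LFC territory" if ft > FLOOR_MS else "inside VRR range"
    print(f"frame {i}: {fps:5.1f} FPS -> {status}")
```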
 
Not really a whole lot you can do without increasing the VRR range with CRU, that I'm aware of. The other option would obviously be to adjust your graphics settings or use upscalers to keep the FPS above 48. This is an inherent issue on some VRR monitors, especially with VA panels. A better cable is not going to resolve this problem.
Someone said in their review of the Club3D cable linked above that it resolved some flickering issues at 165 Hz, so I ordered it. If it helps, great, if it doesn't, then at least I have an extra cable just in case. :ohwell:
 
@r0ach
Not everyone plays the same stuff or has the same priorities.
Ignoring for a moment that I haven't seen a single test/bench/game on mine (VRR over HDMI 2.1) that wasn't smooth above ~20 FPS.

@AusWolf
Can be; worth a try. 50% of the screen issues I had using the DP port were down to the cable, even if it had the correct specs.

Is GPU scheduling enabled in Windows? Try with it off.
 