
GSYNC lowering GPU Usage Resulting in Lower FPS?

What's the problem?

I think his issue is that his monitor is 144Hz, so he is under the impression that he should be seeing the maximum possible framerate as long as it is under 144FPS. But that isn't how Gsync works. It still has to do some syncing with the monitor, and that overhead reduces framerates.

Gsync is designed to minimize the delay between when a frame is rendered and when the monitor displays it. This reduces tearing while also reducing input lag.

Compare this to vsync, where the GPU might render a frame but then have to wait before sending it to the monitor so that it syncs with the monitor's static refresh rate. This introduces a lot of input lag.
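To make that concrete, here's a toy sketch (Python, with made-up timings, not any real driver API) of where vsync's waiting turns into latency while Gsync can scan out on demand:

```python
import math

# Toy model: a 144Hz panel refreshes every ~6.94 ms. With vsync, a finished
# frame sits and waits for the next fixed tick; with Gsync the panel can
# start scanning it out immediately. All timings below are invented.
REFRESH_HZ = 144
TICK = 1000.0 / REFRESH_HZ  # refresh interval in ms (~6.94 ms)

def vsync_display_time(render_done_ms):
    """Frame is held until the next fixed refresh tick."""
    return math.ceil(render_done_ms / TICK) * TICK

def gsync_display_time(render_done_ms):
    """Panel refreshes on demand, so scan-out starts right away."""
    return render_done_ms

for done in (3.0, 7.5, 12.0):
    v, g = vsync_display_time(done), gsync_display_time(done)
    print(f"frame ready at {done:4.1f} ms -> vsync shows it at {v:5.2f} ms "
          f"(+{v - done:.2f} ms wait), gsync at {g:5.2f} ms")
```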

But this whole thread is the perfect practical example of why I'm convinced Gsync is worthless for any high end rig. The OP has effectively concluded that Gsync is pointless right there in the TS. Too bad it took buying one to get there ;)

I disagree, but also agree. I think it depends on whether you are sensitive to input lag. If you are a person that notices input lag, then Gsync and Freesync are great. But for most people with high end rigs who aren't sensitive to input lag and are paired with a high refresh rate monitor, adaptive vsync is the better (and way cheaper) option in my opinion.
 

I feel Adaptive sync and Fast Sync together pretty much cover all the bases. The real advantage of Gsync or FreeSync is below 60 fps, and if you have a high end rig that can't reliably push 60, just tweak a few settings to make it so. You get a stable end result at no cost, and above all you don't have to deal with the myriad of issues surrounding Gsync in the first place, you get to use the monitor's strobe functionality to improve motion resolution and reduce blur, etc.

If you are highly sensitive to input lag, you'd be making sure you have as little FPS variance as possible, because at the most basic level that variance is itself felt as input lag. You'd also want the highest possible minimum fps. Gsync doesn't really fit in there; its only added purpose is to eliminate tearing, which, on a high refresh monitor at 60+ fps, hardly ever happens anyway.
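To illustrate what I mean by variance being felt as lag, here's a quick toy comparison (invented frame times; both runs have the same average FPS):

```python
import statistics

# Two hypothetical runs with identical average frame time (10 ms = 100 FPS)
# but very different consistency. The erratic one feels worse to play.
steady  = [10.0] * 10
erratic = [5.0, 15.0] * 5

for name, times in (("steady", steady), ("erratic", erratic)):
    avg_fps = 1000.0 / statistics.mean(times)
    worst   = 1000.0 / max(times)       # FPS during the slowest frame
    spread  = statistics.pstdev(times)  # frame-time standard deviation
    print(f"{name:8s} avg {avg_fps:5.1f} FPS, worst frame {worst:5.1f} FPS, "
          f"frame-time spread {spread:4.1f} ms")
```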
 
Correct me if I am wrong, but G-Sync helps make the frames match on both the monitor and the GPU so no tearing is shown. Of course it will be a limitation at some point, but when you get help synchronizing the frames sent out from your graphics card with what's shown on your monitor, your GPU will naturally have less work to do, depending on the game you are running.

I have noticed that when playing Overwatch I actually got a better average fps with my setup after getting a G-Sync monitor.
 
This is exactly what GSync is supposed to do... lower the framerate (and by proxy, GPU utilization, if your GPUs are overpowered) to match the monitor's refresh rate, and conversely, lower the monitor's refresh rate to match the GPU's framerate if the framerate dips below the monitor's max refresh rate.
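If it helps, that two-way matching boils down to something like this (a hypothetical helper with assumed VRR window values, not a real API):

```python
# Rough statement of the rule: within the panel's variable refresh window,
# the displayed rate follows whichever side is slower. Window values assumed.
def gsync_displayed_rate(gpu_fps, panel_max_hz=144, panel_min_hz=30):
    if gpu_fps >= panel_max_hz:
        return panel_max_hz   # GPU gets held to the panel's ceiling
    if gpu_fps >= panel_min_hz:
        return gpu_fps        # panel refresh follows the GPU down
    return panel_min_hz       # below the window, other tricks take over

for fps in (200, 120, 60, 20):
    print(f"{fps} FPS from the GPU -> {gsync_displayed_rate(fps)} on screen")
```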

What's the problem?

I guess I was just being too paranoid about my GPU utilization being lower than my monitor's max refresh rate would seem to warrant. As I've stated, with GSYNC off, both cards can achieve 99% utilization, but not when GSYNC is enabled.

If you're being realistic, when you can hit above 100 FPS consistently with any setup, Gsync becomes a problem rather than a solution.

Remove Gsync, use Fast Sync, and you're tear free and maximizing FPS. If your FPS fluctuates too much, use Gsync and accept what it does for you :)
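The Fast Sync idea is easy to sketch, by the way. This is just my toy model of it, not NVIDIA's actual implementation: the GPU renders flat out, and each fixed refresh shows the most recently completed frame, silently dropping the rest:

```python
RENDER_FPS = 200   # what an uncapped GPU might push (assumed)
REFRESH_HZ = 144   # fixed panel refresh

# Completion times (ms) of every frame rendered during one second.
frame_done = [i * 1000.0 / RENDER_FPS for i in range(RENDER_FPS)]

shown = set()
for r in range(REFRESH_HZ):
    scanout = r * 1000.0 / REFRESH_HZ
    ready = [i for i, t in enumerate(frame_done) if t <= scanout]
    if ready:
        shown.add(ready[-1])   # latest complete frame wins; no tearing

print(f"rendered {RENDER_FPS} frames, displayed {len(shown)}, "
      f"dropped {RENDER_FPS - len(shown)} with no back-pressure on the GPU")
```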


Yeah, I'm never buying another GSYNC panel again. I'm just going to wait for HDMI 2.1 to arrive and let that fix the problem. I've had this monitor for a year now and I never saw this problem with Maxwell (GTX 980 SLI). But next time I'm thinking of just buying an ultrawide 1440P monitor.


Therein lies my problem. When I have GSYNC enabled, both my 1080s' utilization is just around 80% - 90%, but when GSYNC is disabled, both 1080s are pegged at 99% usage in any game. I mean, it might dip to 95%, but that's about it. I just don't understand that part lol. But it seems as though it is perfectly normal.
 

With Gsync disabled, your graphics card is going to spit out as many frames as possible, and your GPU usage will be at or close to 100%. The problem with this is that your monitor has a max refresh rate. It can only display up to a maximum framerate, and if your GPU goes above that, you will get screen tearing. VSync and GSync are technologies designed specifically to make sure that this doesn't happen. Your GPU is showing lower utilization because it IS being utilized less: your framerate is being capped at whatever the max refresh rate is. This is the exact same thing that happens with VSync. Both technologies do the same thing. (sort of... nobody get technical in here and crucify me, I'm trying to simplify things...)
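A back-of-the-envelope example with assumed numbers (not measured from your rig) shows how a cap turns into exactly that kind of utilization figure:

```python
# Hypothetical: a frame takes the GPU 5.5 ms to render, but Gsync holds it
# to the panel's 144Hz pace (~6.94 ms per frame). The GPU idles for the
# difference, which shows up as "only" ~79% utilization.
render_ms = 5.5                  # assumed per-frame render time
cap_hz = 144
interval_ms = 1000.0 / cap_hz    # ~6.94 ms between frames when capped

utilization = render_ms / interval_ms   # ~0.79
uncapped_fps = 1000.0 / render_ms       # ~182 FPS at ~100% usage

print(f"capped at {cap_hz} FPS: ~{utilization:.0%} GPU usage; "
      f"uncapped the same card could do ~{uncapped_fps:.0f} FPS")
```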

Gsync ALSO has the added benefit of doing the opposite... it lowers the screen's refresh rate to match the graphics card's framerate IF the framerate drops below the monitor's max refresh rate. This avoids the input lag that can be a problem with VSync.

So your graphics card is doing EXACTLY what it is supposed to do. It is slowing itself down to match the monitor's refresh rate. That is why you see less utilization. When you turn GSync off, your card is spitting out more frames than your monitor can take, and you NEVER see those extra frames. It's physically impossible for framerate above your monitor's maximum to help you. If your monitor's max refresh rate is 144Hz, and your graphics card is spitting out 200 fps at 100% utilization, you are NOT seeing those extra frames. They just result in screen tearing. When you turn GSync on, it slows your graphics card down to match the maximum refresh rate of your monitor. This gives you the highest possible visible framerate while eliminating screen tearing.
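The arithmetic behind that, again with made-up numbers:

```python
# A 144Hz panel performs 144 scan-outs per second no matter what, so at most
# 144 whole frames can ever reach your eyes. Anything beyond that only shows
# up as partial frames, i.e. tearing.
panel_hz = 144
gpu_fps = 200   # assumed uncapped framerate

visible = min(gpu_fps, panel_hz)
wasted = gpu_fps - visible
print(f"{gpu_fps} FPS rendered -> at most {visible} whole frames shown; "
      f"{wasted} per second can only ever appear as tearing")
```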

You're seeing a ghost here. There is nothing wrong with the fact that your GPU is only being used 80% while GSync is on. That is a GOOD thing, and means that your graphics card has more horsepower than your monitor can handle. This means that if you play a more demanding game, or crank up the settings, you've still got that extra horsepower in reserve, and can still meet the maximum framerate of your monitor. It also means that your graphics card runs cooler instead of working hard for no reason, spitting out extra frames that you can't even see because your monitor can't display them. This could potentially extend the graphics card's life.

In short, that extra 20% of graphics power that isn't being used when GSync is on will NOT help you at all. You can't see the extra frames provided by that power, because your monitor can't display them.

Do yourself a favor, and turn GSync on, and never look at that utilization percentage number again. You're completely misunderstanding what is going on behind the scenes, and it's causing you pain. Turn GSync on, enjoy nice smooth framerates at the maximum that you could see with your monitor anyway, and have fun. The utilization percentage means nothing in this case.
 

Thanks for this. I have been ignoring that, actually. I even uninstalled RivaTuner just so I wouldn't get the urge to turn the OSD on.
 