
Is there a difference between 144Hz and 144Hz Gsync/Freesync ?

In theory both panels should look and operate the same. That's the theory, though, not necessarily the practice.
What about the minimum refresh rate supported by a panel? That info is crucial for FreeSync and completely irrelevant for G-Sync ...
 
What about the minimum refresh rate supported by a panel? That info is crucial for FreeSync and completely irrelevant for G-Sync ...
Well, when you go below the minimum refresh rate, G-Sync starts doubling, tripling, etc., depending on the frame rate. For example, if the FPS is 25, G-Sync draws each frame twice so the monitor runs at 50 Hz. If it drops to, say, 17 FPS, it draws each frame three times so the monitor runs at 51 Hz. FreeSync, on the other hand, if the monitor has a VRR floor of 35 Hz, will just keep refreshing at 35 Hz no matter what the FPS is, so there will be tearing and stuttering. AMD could add that frame doubling to their drivers so it does the same thing the G-Sync module does, but whether that ever happens is up to AMD.

http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

You can read up on it there; they also have a video explanation.
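As a rough illustration of the frame-multiplication idea described above (this is not NVIDIA's actual algorithm; the function name and the 36 Hz floor are my own illustrative choices), the module effectively repeats each frame just enough times to stay above the panel's refresh floor:

```python
# Illustrative sketch of the frame-multiplication idea described above.
# Not NVIDIA's actual algorithm; the 36 Hz floor is only an example value.

def effective_refresh(fps, vrr_floor=36):
    """Repeat each frame just enough times to keep the panel above its refresh floor."""
    if fps >= vrr_floor:
        return fps, 1                      # the panel can track the frame rate directly
    multiplier = 1
    while fps * multiplier < vrr_floor:
        multiplier += 1                    # double, triple, ... each frame as needed
    return fps * multiplier, multiplier

print(effective_refresh(25))   # (50, 2) -> 25 fps shown at 50 Hz, each frame drawn twice
print(effective_refresh(17))   # (51, 3) -> 17 fps shown at 51 Hz, each frame drawn three times
```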
 
Judder is very obvious too.
Evenly paced frames at 72 fps should be pretty smooth ... with a single GPU, that is, but you have SLI, so that's another story judder-wise ... You can try setting maximum pre-rendered frames = 1 and triple buffering = on in the NVIDIA Control Panel to see if it helps. Also, when using adaptive v-sync, don't forget to turn off the in-game v-sync setting.
Well, when you go below the minimum refresh rate, G-Sync starts doubling, tripling, etc., depending on the frame rate. For example, if the FPS is 25, G-Sync draws each frame twice so the monitor runs at 50 Hz. If it drops to, say, 17 FPS, it draws each frame three times so the monitor runs at 51 Hz. FreeSync, on the other hand, if the monitor has a VRR floor of 35 Hz, will just keep refreshing at 35 Hz no matter what the FPS is, so there will be tearing and stuttering. AMD could add that frame doubling to their drivers so it does the same thing the G-Sync module does, but whether that ever happens is up to AMD.
Yeah, I know, that's why I said the minimum refresh rate is a non-issue for G-Sync ... what I meant is that both panels should operate the same only if the frame rate stays within a certain range.
 
Yeah, I know, that's why I said the minimum refresh rate is a non-issue for G-Sync ... what I meant is that both panels should operate the same only if the frame rate stays within a certain range.
They should, but when you get to higher rates like 85 Hz+, FreeSync has had reported ghosting issues. That comes down to the overdrive/voltage control of the hardware in the monitor, which still needs a lot of work. You may have noticed that some new FreeSync panels can do 144 Hz with FreeSync off, but max out at 90 Hz once FreeSync is enabled; that is to minimize the ghosting. AMD left that issue for the monitor makers to solve, and you can see the ghosting if you get a chance to test one. It would be best to wait a while before buying one until they work it out more.
 
AMD left that issue for the monitor makers to solve
Limit to 90 Hz to minimize ghosting = problem solved :laugh: And they are not even that much cheaper than G-Sync monitors ... gotta love monitor makers.
 
I have actually done a little test while I was switching between ULMB and G-Sync to see how good ULMB is.
For this I tested Battlefield 4 with everything maxed and only MSAA off.
I also overclocked my two 970s to about 1400 MHz to maintain a frame rate of 120+.

I have to say that G-Sync is amazing even if your frame rate hardly drops below 120.
I personally notice the difference between 120 FPS with G-Sync off and 120 FPS with G-Sync on.
Before getting my monitor I watched people on YouTube say things like "you won't believe how good G-Sync is until you have seen it for yourself," and I thought it couldn't be that amazing, until I got my monitor and saw for myself how good variable refresh rate is.

I love variable refresh rate and am so happy I made the purchase :)
 
Limit to 90 Hz to minimize ghosting = problem solved :laugh: And they are not even that much cheaper than G-Sync monitors ... gotta love monitor makers.
Yeah, it was the easiest way to fix the problem instead of spending months, or maybe even a year, solving it. NVIDIA spent a lot of time testing panels to see which ones respond well to VRR; if you buy a G-Sync monitor, it uses one of those panels NVIDIA certified themselves to work the way they intended. FreeSync will eventually work at 144 Hz, but the key word is "eventually".
 
Yeah, it was the easiest way to fix the problem instead of spending months, or maybe even a year, solving it. NVIDIA spent a lot of time testing panels to see which ones respond well to VRR; if you buy a G-Sync monitor, it uses one of those panels NVIDIA certified themselves to work the way they intended. FreeSync will eventually work at 144 Hz, but the key word is "eventually".

Every G-Sync panel tested by TFT Central has ghosting. G-Sync also suffers from inversion artifacts.
 
Every G-Sync panel tested by TFT Central has ghosting. G-Sync also suffers from inversion artifacts.
That's the expected minimal amount of ghosting every panel has. The problem is that some FreeSync monitors need a firmware upgrade for their overdrive control to properly reduce ghosting to those expected minimal levels while using a dynamic refresh rate. (link)
 
Yeah, it was the easiest way to fix the problem instead of spending months, or maybe even a year, solving it. NVIDIA spent a lot of time testing panels to see which ones respond well to VRR; if you buy a G-Sync monitor, it uses one of those panels NVIDIA certified themselves to work the way they intended. FreeSync will eventually work at 144 Hz, but the key word is "eventually".
Are you just making things up as you go??? FreeSync DOES work at 144 Hz: the Acer XG270HU has a FreeSync range of 40-144 Hz (a monitor I have and am using currently), and on my system it has no issues enabling FreeSync at 144 Hz and moving the FPS around in BF4 without anything being noticeable. The "ghosting" issue is the same issue any monitor can have, as @Xzibit said:
Every G-Sync panel tested by TFT Central has ghosting. G-Sync also suffers from inversion artifacts.
It has the same issues that can appear...

Limit to 90 Hz to minimize ghosting = problem solved :laugh: And they are not even that much cheaper than G-Sync monitors ... gotta love monitor makers.
The Acer XG270HU vs. the ROG Swift is a common comparison, since their specs and picture quality are roughly the same, and that is a 250-dollar difference according to Newegg's current prices.

Also, this is off subject anyway... The OP asked whether having variable refresh makes a difference compared to not having it at all, not about the difference between G-Sync and FreeSync specifically...
 
Are you just making things up as you go??? FreeSync DOES work at 144 Hz: the Acer XG270HU has a FreeSync range of 40-144 Hz (a monitor I have and am using currently), and on my system it has no issues enabling FreeSync at 144 Hz and moving the FPS around in BF4 without anything being noticeable. The "ghosting" issue is the same issue any monitor can have, as @Xzibit said:
I'm not making anything up. Yes, there are SOME that do 144 Hz, but they will have more noticeable ghosting than the ones that limited the panel to 90 Hz. It is a short-term fix until they solve it at 144 Hz, but IMO if I am buying a 144 Hz panel, being locked to 90 Hz is BS.

Every G-Sync panel tested by TFT Central has ghosting. G-Sync also suffers from inversion artifacts.
I bet fixed-refresh-rate panels still have ghosting to some extent too; the issue is how bad it is compared to the other tech.
Also, this is off subject anyway... The OP asked whether having variable refresh makes a difference compared to not having it at all, not about the difference between G-Sync and FreeSync specifically...
In an earlier post I said that in theory both would work and look the same, but theory isn't always how things work in practice.
 
I bet fixed-refresh-rate panels still have ghosting to some extent too; the issue is how bad it is compared to the other tech.

ViewSonic VP2780-4K
4K, 10-bit (10-bit + A-FRC), 60 Hz AH-IPS, better than any G-Sync/FreeSync monitor for ghosting at $799.99, and no inversion issues.


http://www.tftcentral.co.uk/images/pixperan/viewsonic_vp2780-4k.jpg
 
Evenly paced frames at 72 fps should be pretty smooth ... with a single GPU, that is, but you have SLI, so that's another story judder-wise ... You can try setting maximum pre-rendered frames = 1 and triple buffering = on in the NVIDIA Control Panel to see if it helps. Also, when using adaptive v-sync, don't forget to turn off the in-game v-sync setting.
Thanks for the help, but there isn't actually a fault here. The exact way it looks also depends on the refresh rate of the monitor, the type of monitor and one's vision. Let me explain.

Setting vsync to refresh rate means that you get this:

Frame / Movement

1 Move
2 Move
3 Move
4 Move
etc

This leads to perfect movement on a strobed display. A non-strobed display produces significant motion blur regardless of brand or model, but motion can still look smooth.


Half refresh rate vsync means that you get this:

Frame / Movement

1 Move
2 Stop
3 Move
4 Stop
etc

As you can see, the picture stands still for one frame out of every two, and this stop-start motion is what leads to visible judder. Motion blur behaves as above for strobed/unstrobed displays.
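To make the two patterns above concrete, here is a tiny toy timeline (the timeline function and its numbers are my own, purely illustrative) of a 60 Hz panel with a new frame presented on every refresh versus every second refresh:

```python
# Toy timeline of v-sync at full refresh rate vs. half refresh rate on a 60 Hz panel.
# With a new frame every refresh the object moves each scanout; at half rate it
# moves, then holds for one refresh, producing the move/stop cadence described above.

REFRESH_HZ = 60

def timeline(refreshes_per_frame, refreshes=6):
    position = 0
    rows = []
    for r in range(refreshes):
        new_frame = (r % refreshes_per_frame == 0)   # does a new frame arrive this refresh?
        if new_frame:
            position += 1
        rows.append(f"{r * 1000 / REFRESH_HZ:5.1f} ms: {'Move' if new_frame else 'Stop'} (pos {position})")
    return rows

print("Full refresh v-sync:")
print("\n".join(timeline(1)))
print("Half refresh v-sync:")
print("\n".join(timeline(2)))
```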

Now, the way this stop-start motion is perceived depends on various things as described below.


60Hz monitor refresh

Highly visible judder. Very annoying to look at. Lots of motion blur unless strobed.


120/144Hz

Smooth motion, but with edge doubling and motion blur - essentially two pictures superimposed on top of each other. I've not tried 100 Hz, which presumably might look somewhere between judder and smoothness. I might try it at some point, but it's too late at night now to bother with it.


120/144Hz strobed (CRT like)

Smooth motion, but with edge doubling. Picture is sharp (no motion blur) but the superimposed images reduce clarity significantly. It's similar to looking at a 3D Vision display without the glasses, except that the two images are identical.



In all three cases the motion is actually juddering, but at higher refresh rates it's perceived differently: as smooth motion with a doubled picture. The only time you get perfect motion is with v-sync and movement every frame, along with a strobing backlight. This includes 60 Hz strobed displays (usually a CRT), although it looks noticeably smoother and more refined at 120 Hz strobed.

I see that unfortunately you only have a 60Hz monitor so you can't check out what I've explained and see for yourself. If you can get your paws temporarily on a 120Hz monitor with strobing backlight I think it would be well worth it.

I tested the above with SLI off as with it on, there can indeed be some microstutter at times to muddy the waters.
 
I have a BenQ XL2730Z with FreeSync. I play only CS:GO, so for a strenuous test of the monitor this is probably not the right game. My PC is an Intel 5930X, MSI X99S MPOWER, 32 GB RAM, SSD/HDD, AX860i. I was using a Sapphire Radeon R9 295X2 and getting 290-299 FPS, but substituting an MSI R9 290 Gaming (4 GB) also gets around 290-299 FPS at peak. The game's average FPS is all over the map due to something in Catalyst, I am sure.

Once I replaced the DVI-D connection with a DP 1.2 cable, both cards cause Catalyst to ask if I want to activate FreeSync. Once I do, the FPS drops a bit, but the lows are no longer in the 30s and 40s, but in the 80s and 90s, and the highs rarely get above 200 FPS. BUT... the game is WAY smoother, and it actually looks MUCH better due to the sync. I run the monitor at 144 Hz, so I am asking the best from it, but the sync seems to make quite the difference. Since the R9 290 seems to do as well as the 295X2, I am now using only that card for CS:GO, since the 295X2 is overkill and quite the power hog. I am building a second gamer, and it will have GTX 980 Ti cards in SLI, and I fully expect to try the ASUS ROG G-Sync 1440p monitor on that.

Prior to this I was gaming on XFX 7970s in CrossFire using a Samsung 305T at 2560x1600. Although that also provides massive FPS in CS:GO, it also has a bit of tearing. The Samsung does not have the same response time either, and its scan rate is fixed. I can say that using a sync-related solution seems to truly increase my enjoyment of the game, but it's a pretty penny to get there. Gaming at 1440p or higher is very spoiling, though, and I now have a hard time using a Full HD screen. Oops.

By the way, even when not gaming and doing other work on the PC, the BenQ XL2730Z is amazing. Such a great display, and so well packaged. I am considering another one for my wife's work PC at home. It's really bright and helps her focus.
 
Yes, it SO is the right word.

I've clarified that a bit now in my post, but you can see that's what I was driving at if you read it carefully. I had actually mentioned vsyncing in the last sentence anyway.

Drop frames with vsync on and you get judder, and possibly hitching too. Actually, you get judder with vsync off as well, but it will also include screen tearing and other artifacts.


THIS is the actual definition of judder as it relates to what is seen on displays.

http://whatis.techtarget.com/definition/judder

What you're referring to is not at all caused by the same thing, nor does it even have the same look.
 
That's exactly what judder is. I guess we're interpreting it differently than you, Frag Maniac. 24 -> 60 -> hosed frame rate.
 
THIS is the actual definition of judder as it relates to what is seen on displays.

http://whatis.techtarget.com/definition/judder

What you're referring to is not at all caused by the same thing, nor does it even have the same look.
That's just one particular example of judder. Yes, it's quite obviously juddery, and it doesn't negate my point at all; it complements it. I've explained it all in post 39, did you not see it?

The whole point of us enthusiasts spending big money on powerful graphics cards and CPUs is to get rid of judder in our games, so I don't understand why this concept is so hard to get. All one needs to know is that if the GPU drops frames, the movement judders; it's as simple as that. That is the whole point of G-Sync and FreeSync: they reverse the synchronization, with the display following the GPU instead of the other way around, so one doesn't see this artifact.
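A very rough toy sketch of that reversal (the function names and frame times here are my own inventions, not how the actual hardware or drivers work): with fixed-refresh v-sync a finished frame waits for the next scheduled scanout, while with variable refresh the display simply refreshes when the frame is ready.

```python
# Toy contrast between fixed-refresh v-sync and variable refresh (G-Sync/FreeSync style).
# Purely conceptual; real scheduling, VRR windows, and driver behaviour are far more involved.

def vsync_display(frame_ready_times, refresh_hz=60):
    """Fixed clock: a finished frame only appears at the next scheduled refresh tick."""
    period = 1.0 / refresh_hz
    shown = []
    for t in frame_ready_times:
        ticks = int(t / period) + 1            # wait for the next tick after the frame is done
        shown.append(round(ticks * period, 4))
    return shown

def vrr_display(frame_ready_times):
    """Variable refresh: the display refreshes as soon as the GPU finishes a frame."""
    return list(frame_ready_times)             # presentation follows the GPU directly

frames = [0.010, 0.032, 0.041, 0.070]          # uneven GPU frame completion times (seconds)
print(vsync_display(frames))   # quantized onto the 60 Hz grid -> uneven on-screen gaps, i.e. judder
print(vrr_display(frames))     # shown exactly as rendered -> the GPU's own pacing is preserved
```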

Finally, when one goes to the movies, one can see judder (or "strobing" as movie makers like to call it) all the time, especially in panning shots. This is because the movie is shot at 24 fps, but the projector flashes each frame twice to run at 48 Hz to reduce flicker, leading to that stop-go motion I described above and hence judder. Since the frame rate is quite slow, the judder becomes really obvious. I guess I could have added this example to my explanation so people could have related to it and understood what I was saying better, perhaps.
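As a quick sanity check of those cinema numbers (just restating the arithmetic above, nothing more):

```python
# 24 fps film on a projector flashing at 48 Hz: flicker is reduced, judder is not.
film_fps = 24
projector_hz = 48
flashes_per_frame = projector_hz // film_fps    # each film frame is flashed twice
frame_hold_ms = 1000 / film_fps                 # the image itself only changes every ~41.7 ms
print(flashes_per_frame, round(frame_hold_ms, 1))   # 2, 41.7
```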

Another example is those new 60 Hz YouTube videos. They look so much smoother because, on a 60 Hz display, there are no dropped frames.

Finally, www.blurbusters.com has a section on this too, and you can play around with a test of it here: www.testufo.com/#test=framerates. I recommend using Chrome for the most stable testing and lack of hitching.
 
I'm not making anything up. Yes, there are SOME that do 144 Hz, but they will have more noticeable ghosting than the ones that limited the panel to 90 Hz. It is a short-term fix until they solve it at 144 Hz, but IMO if I am buying a 144 Hz panel, being locked to 90 Hz is BS.

There is only one monitor (available at this time) rated for 144 Hz that stops at 90 Hz with FreeSync: the ASUS MG279Q, whose range is 35-90 Hz.
The Acer XG270HU is rated at 144 Hz and its range is 40-144 Hz.
The BenQ XL2730Z is rated for 144 Hz and its range is 40-144 Hz.

The only other available FreeSync monitor is the LG 29UM67P, but it's an ultrawide rated for up to 75 Hz with a FreeSync range of 48-75 Hz (there is another ultrawide, I believe, with a similar range). Of all the available FreeSync monitors, the ASUS is the only one whose range is currently limited to 90 Hz even though its max refresh is 144 Hz. So this is not "some"; this is most of the monitors with a 144 Hz range.

ViewSonic VP2780-4K
4K, 10-bit (10-bit + A-FRC), 60 Hz AH-IPS, better than any G-Sync/FreeSync monitor for ghosting at $799.99, and no inversion issues.


http://www.tftcentral.co.uk/images/pixperan/viewsonic_vp2780-4k.jpg
Yeah, but at 60 Hz it seems almost a waste at times, even though I am considering waiting for a FreeSync 4K monitor with a decent range. I would say most of where these techs prove their worth (at least in being awesome) is in the higher refresh ranges. But that is one heck of an expensive monitor, lol.
 
@GhostRyder That Asus 90Hz limit is retarded and a complete dealbreaker for me.
 
That's just one particular example of judder.

Actually no, you're trying to redefine judder as it applies to displays, and not doing so very well. Common with gamers, I find. :rolleyes:
 
Actually no, you're trying to redefine judder as it applies to displays, and not doing so very well. Common with gamers, I find. :rolleyes:
I've taken the time to give you a solid explanation over two posts in an intelligent manner. If you're going to try to rebut it, you'll have to do better than being argumentative with a cheap put-down and an eye roll. It's you who's not doing so well.

Judder is judder, whether the source material is CGI of any form, film, or something else. It's that stop-go motion that causes it, end of story. And for those examples I gave above, I actually used my own high-refresh monitor to investigate before posting. Get a similar monitor and then come back to me; otherwise you just don't have a clue, as you haven't seen it. Heck, you should be able to see the 30/60 Hz stuff at least anyway.
 
@GhostRyder That Asus 90Hz limit is retarded and a complete dealbreaker for me.
Yeah, not my favorite. I am currently playing games on the Acer FreeSync monitor (borrowing a friend's). I wanted to decide if I liked having FreeSync (or G-Sync) and 144 Hz at 1440p over 4K 60 Hz.
 
I've taken the time to give you a solid explanation over two posts in an intelligent manner. If you're going to try to rebut it, you'll have to do better than being argumentative with a cheap put-down and an eye roll. It's you who's not doing so well.

I've seen your so-called explanations, and my statement still stands. You're taking the word judder, as it was originally applied to displays, completely out of context, and using your loose description of microstutter caused by completely different sources as a valid explanation.

Given that you are ignoring that, yes, the eye roll was appropriate, and it wasn't meant as a put-down, just an observation. It's the kind of dime-a-dozen redefining of terms gamers often engage in on forums. If you feel insulted by that assessment, that's you being overly sensitive.
 
I've seen your so-called explanations, and my statement still stands. You're taking the word judder, as it was originally applied to displays, completely out of context, and using your loose description of microstutter caused by completely different sources as a valid explanation.

Given that you are ignoring that, yes, the eye roll was appropriate, and it wasn't meant as a put-down, just an observation. It's the kind of dime-a-dozen redefining of terms gamers often engage in on forums. If you feel insulted by that assessment, that's you being overly sensitive.
No, you just don't understand, and you keep claiming that I'm seeing it from a "gamer's perspective" and "redefining terms" when I've explained that I'm not, and exactly why. And yeah, it's kind of annoying. Judder is a really simple concept to understand, and I've given you a gold-plated answer explaining it, so I really don't know why you're making such a meal of it.

Why don't you actually click on that link I supplied? You'll see the judder on your 60 Hz display. I recommend using Chrome, as it renders the test properly. You might actually learn something about this and stop arguing with me.

Finally, I'm curious, do you think I'm seeing microstutter since I have two graphics cards in SLI and confusing it with judder?
 