Monday, January 28th 2019

MSI Monitors are Now G-Sync Compatible

Following NVIDIA's announcement of its newest drivers, MSI monitors are now effectively G-Sync Compatible! This technology allows G-Sync to be used on Adaptive Sync monitors. G-Sync, an anti-tearing, anti-flickering and anti-stuttering monitor technology designed by NVIDIA, was once exclusive to monitors that had passed NVIDIA certification. With the newest GPU driver release, NVIDIA now allows G-Sync to be used on monitors that support Adaptive Sync technology when they are connected to an NVIDIA graphics card.
At the moment, not all Adaptive Sync monitors on the market are perfectly G-Sync Compatible. MSI has therefore been continuously testing its Adaptive Sync monitors to determine whether they are G-Sync Compatible. Below are the test results.
*Models not listed are still under internal testing

Graphics cards used for the test:
  • MSI RTX 2070 Ventus 8G
  • MSI RTX 2080 Ventus 8G
  • MSI GTX 1080 Gaming 8G
System Requirements for G-Sync Compatibility:
  • Graphics Card: NVIDIA 10 series (Pascal) or newer
  • GPU driver version 417.71 or later
  • DisplayPort 1.2a (or above) cable
  • Windows 10
Limitations at the moment:
  • Only a single display is currently supported; multiple monitors can be connected, but no more than one display should have G-SYNC Compatible enabled
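The driver requirement above boils down to a simple version comparison. A minimal sketch, assuming the version string is obtained elsewhere (e.g. from `nvidia-smi --query-gpu=driver_version --format=csv,noheader`); the function name and parsing are illustrative, not NVIDIA's API:

```python
# Illustrative check of the requirement above: G-Sync Compatible needs
# NVIDIA driver version 417.71 or later. The helper below is a sketch;
# it assumes a dotted version string like "418.81".

MIN_GSYNC_COMPATIBLE_DRIVER = (417, 71)

def meets_gsync_driver_requirement(version: str) -> bool:
    """Return True if a dotted driver version string is 417.71 or later."""
    parts = tuple(int(p) for p in version.split("."))
    # Tuple comparison checks the major number first, then the minor.
    return parts >= MIN_GSYNC_COMPATIBLE_DRIVER

print(meets_gsync_driver_requirement("417.35"))  # older driver -> False
print(meets_gsync_driver_requirement("418.81"))  # newer driver -> True
```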

22 Comments on MSI Monitors are Now G-Sync Compatible

#1
hat
Enthusiast
Huh, didn't know MSI made monitors...
#2
Vayra86
Hahah I like how that last bastion of Gsync is slowly but surely torn down. The best excuse right now is 'compatibility' and 'we need to test it'... and every test result basically says 'it works fine' and whenever it doesn't, that's mostly up to Nvidia's driver.

So much for selling some weird premium form of Gsync at the end of the day, Huang. That ain't gonna fly. You lost this one, now get back to work on that driver TY.
#3
cucker tarlson
how about ulmb ? how come you paid a premium for your fg over a normal 120hz monitor. did it use to fly then ? :)
#4
Vayra86
cucker tarlson said:
how about ulmb ? how come you paid a premium for your fg over a normal 120hz monitor. did it use to fly then ? :)
Is this directed at me? ('your fg')

If it is - FWIW - I paid 379 EUR for this monitor... look around for a quality VA high refresh 1080p panel today and I'm not so sure it's a premium at all ;) Regardless. There are some arguments to be made here:
- ULMB / strobe only works with fixed refresh rates, so it doesn't mix with Gsync or FreeSync at all. The best Huang could sell today is 'we have a very expensive strobing backlight'. And it's not even a USP in today's market. So it's a very awkward proposition to make, when all you offer is a display mode that only works in a very specific use case, and excludes the main selling point of an adaptive sync monitor. They also can't pull an AMD and call it Gsync 2, because there is nothing new to offer. All Gsync monitors have ULMB.
- There are also monitors without Gsync and with strobe, like mine (it's also not ULMB, but a slightly different tech/implementation).
- Back then it was one of the first VA gaming panels around. Hard to compare to begin with; most of what you had with high refresh was TN or astronomically priced IPS (equally or more expensive than this one at the time, and without strobe).
#5
yakk
hat said:
Huh, didn't know MSI made monitors...
Buy panel, mold a plastic case, add essential RGB, subcontract assembly, add box & shipping. Use existing marketing and distribution network.

Not a big risk to MSI to expand business.
#6
Manu_PT
hat said:
Huh, didn't know MSI made monitors...
Not only do they, but they also make one of the best 144 Hz monitors money can buy, the Optix MAG241C: 3200:1 contrast, the best pixel overdrive for VA so far, 125% sRGB, no glow, no colour banding, Gsync/FreeSync and very low input lag.
#7
Sasqui
Vayra86 said:
Hahah I like how that last bastion of Gsync is slowly but surely torn down. The best excuse right now is 'compatibility' and 'we need to test it'... and every test result basically says 'it works fine' and whenever it doesn't, that's mostly up to Nvidia's driver.

So much for selling some weird premium form of Gsync at the end of the day, Huang. That ain't gonna fly. You lost this one, now get back to work on that driver TY.
And the kicker is that they are selling G-Sync as if they unleashed some magical unicorn that can do it all, including Adaptive Sync. When in fact it could all along; they just weren't going to let that happen so they could pluck another $200 from customers' pockets. Didn't work out so well.

Go look at NVDA stock price this morning. They cut forward guidance once again, it's down 12% at the moment.
#8
DenFox
Manu_PT said:
Not only do they, but they also make one of the best 144 Hz monitors money can buy, the Optix MAG241C: 3200:1 contrast, the best pixel overdrive for VA so far, 125% sRGB, no glow, no colour banding, Gsync/FreeSync and very low input lag.
I'm so happy! I'm waiting for an MSI Optix MAG241C and a Gigabyte RTX 2060 Windforce OC with a DP 1.4 cable I bought a few days ago from Amazon! And now this news!

:peace::peace::peace:
#9
Manu_PT
Sasqui said:
And the kicker is that they are selling G-Sync as if they unleashed some magical unicorn that can do it all, including Adaptive Sync. When in fact it could all along; they just weren't going to let that happen so they could pluck another $200 from customers' pockets. Didn't work out so well.

Go look at NVDA stock price this morning. They cut forward guidance once again, it's down 12% at the moment.
Calm down there. The Gsync module still delivers a superior experience! We're talking about overdrive tweaking at every refresh rate! We're talking about uniform panel scanout at every refresh, brightness control, smooth low framerate compensation, etc.

The Gsync module experience is still superior, let's not ignore that. Is or was it ever worth it? Maybe not for most of us :)

But assuming this driver-enabled Gsync experience is the same is wrong.
#10
Vayra86
Manu_PT said:
Calm down there. The Gsync module still delivers a superior experience! We're talking about overdrive tweaking at every refresh rate! We're talking about uniform panel scanout at every refresh, brightness control, smooth low framerate compensation, etc.

The Gsync module experience is still superior, let's not ignore that. Is or was it ever worth it? Maybe not for most of us :)

But assuming this driver-enabled Gsync experience is the same is wrong.
Dude, come on. 99% of that is inflated marketing BS and the remaining 1% is never really visible unless you're analyzing the pixels themselves. It also doesn't defeat physics in any way; if you have huge FPS dips, you will have problems no matter what, and Gsync or FreeSync makes no difference then. I can bet you right now that in double-blind testing no one could see a difference. Even high refresh uncapped versus Gsync is already hard for most. Maybe it's not technically the same, but the experience sure is.
#11
Sasqui
Manu_PT said:
But assuming this driver-enabled Gsync experience is the same is wrong.
We'll see. NVidia is still the antithesis of an open development environment. Unless it suits them.
#12
Manu_PT
Vayra86 said:
Dude, come on. 99% of that is inflated marketing BS and the remaining 1% is never really visible unless you're analyzing the pixels themselves. It also doesn't defeat physics in any way; if you have huge FPS dips, you will have problems no matter what, and Gsync or FreeSync makes no difference then. I can bet you right now that in double-blind testing no one could see a difference. Even high refresh uncapped versus Gsync is already hard for most. Maybe it's not technically the same, but the experience sure is.
It depends on the model, my friend. You can clearly see the response time going nuts when there are a lot of framerate dips on some models, while with a Gsync module the overdrive stays consistent across the entire range.

And this doesn't mean I don't agree with you. For example, Nvidia considers the Asus XG248Q as G-Sync Compatible on their official list. If you try Gsync on that monitor and compare with, let's say, a Samsung FG73, the difference is clear. A Gsync module monitor still offers a superior VRR experience that can't be ignored.

You can also notice how many times a monitor without a Gsync module hits the ceiling, unless you cap to 120fps. While on a Gsync module monitor you can get away with a 141fps cap and be assured it will never hit the ceiling. Hitting the ceiling will cause either:

- tearing
- input lag

It is more precise when you have a module, while with the driver the monitor adjustments are all over the place, again, depending on the model. You can notice it more on some models compared to others.

You can visit Blur Busters if you are interested in reading about the differences. With that being said, Gsync is still too expensive and that's the problem. Getting just a good FreeSync monitor that works well with driver-enabled Gsync is a much smarter option. But simply saying this driver Gsync is the same as the module one is not right, trust me.
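The capping rule above amounts to staying a few frames under the panel's maximum refresh so VRR never hits the ceiling. A rough sketch (the 3 fps margin is the commonly cited Blur Busters guideline; the function name is made up for illustration):

```python
# Rough sketch of the capping rule discussed above: keep the frame cap a
# few fps below the panel's maximum refresh so VRR never hits the ceiling.
# The 3 fps margin is the commonly cited Blur Busters guideline; the
# function name is illustrative.

def vrr_frame_cap(max_refresh_hz: int, margin_fps: int = 3) -> int:
    """Return a frame-rate cap safely below the monitor's maximum refresh."""
    return max_refresh_hz - margin_fps

print(vrr_frame_cap(144))  # -> 141, matching the cap mentioned above
print(vrr_frame_cap(120))  # -> 117
```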
#13
nickbaldwin86
Sasqui said:
Go look at NVDA stock price this morning. They cut forward guidance once again, it's down 12% at the moment.
Stop using stock market prices and percentages for an argument. It is silly and doesn't reflect anything.

Oh, compared to what? The stock of AMD, which is worth as much as $115 less than that of NVIDIA? LOL, yeah, great job there. Oh, AMD is also currently at a 7% drop.

We all know the mining craze of 2018 HUGELY inflated the stock for NVIDIA, and it has dropped because that craze is long over.

I don't think that stock prices and G-Sync being free are all of a sudden related.

I run NV and G-Sync. I haven't seen reviews saying the experience on NV and FreeSync is favorable over NV and G-Sync. They say it is the cheaper monitor option, but if you want the best you will pay for the best. I am sure more YouTubers will be making more NV and FreeSync videos as this ramps up onto more monitors.

Wait for the right deals and don't buy things on day 1 release ;) I picked up mine on a killer deal and no regrets.

If it was equal, NVIDIA would or will stop making G-SYNC altogether; guess we will wait and see what happens. The price of G-Sync monitors will likely drop as well.
#14
Mistral
Bloody nVidia marketing. "nVidia cards now are VRR capable on MSI monitors" is how the title should read.
#15
EarthDog
Soooooooooo did NVIDIA test this or is this MSI making the claims? If the latter I wonder if they used the same testing as NVIDIA does to certify these (the 'hundreds' of tests)?
#16
Sasqui
nickbaldwin86 said:
Stop using stock market prices and percentages for an argument. It is silly and doesn't reflect anything.
It's a statement. Crypto hangover that they're still nursing, and blaming lowered guidance on China. China was mining like crazy with video cards and ASICs, so yes, they're buying fewer video cards too, a lot fewer. Yes, AMD got hit as well, for sure.
#17
EarthDog
Wouldn't AMD get hit more? Wasn't it their cards that were more efficient and had a quicker ROI?
#18
Sasqui
EarthDog said:
Wouldn't AMD get hit more? Wasn't it their cards that were more efficient and had a quicker ROI?
AMD's revenue is around 25% graphics. NVidia's is around 50% graphics. So no.
#19
EarthDog
They sure seemed to have a lot more cards going at it... it's the reason their price was inflated so much!
#20
Sasqui
EarthDog said:
They sure seemed to have a lot more cards going at it... it's the reason their price was inflated so much!
Yea on both teams. I believe (with nothing to back it up lol) that NVidia ramped up production a lot more than AMD did. And rightfully so, given the demand.
#21
John Naylor
Vayra86 said:
- ULMB / strobe only works with fixed refresh rates, so it doesn't mix with Gsync or FreeSync at all. The best Huang could sell today is 'we have a very expensive strobing backlight'. And it's not even a USP in today's market. So it's a very awkward proposition to make, when all you offer is a display mode that only works in a very specific use case, and excludes the main selling point of an adaptive sync monitor. They also can't pull an AMD and call it Gsync 2, because there is nothing new to offer. All Gsync monitors have ULMB.
Couple of clarifications ...


It's not intended to, at least at present; the tech hasn't advanced that far. You turn off G-Sync to use ULMB... most find the break-off point to be between 70 and 80 *minimum* fps, but in speaking with users we've done builds for, who game a lot more than I have time to... most of them with the necessary GFX horsepower are using 1440p IPS AUOptronics 8 or 10 bit screens and using ULMB almost exclusively. It's not as if you use the 100 Hz or 120 Hz setting and if you have 90 fps all goes to hell. Look at the reviews on TFTCentral... not having ULMB is considered a major con.

3.5+ years old but still relevant ...

http://www.tftcentral.co.uk/articles/variable_refresh.htm
At consistently higher frame rates as you get nearer to 144 fps the benefits of G-sync are not as great, but still apparent. There will be a gradual transition period for each user where the benefits of using G-sync decrease, and it may instead be better to use the ULMB feature included, which is not available when using G-sync. Higher end gaming machines might be able to push out higher frame rates more consistently and so you might find less benefit in using G-sync. The ULMB could then help in another very important area, helping to reduce the perceived motion blur caused by LCD displays. It's nice to have both G-sync and ULMB available to choose from certainly on these G-sync enabled displays. Very recently NVIDIA has added the option to choose how frequencies outside of the supported range are handled. Previously it would revert to Vsync on behaviour, but the user now has the choice for Vsync on or off.
The only thing new about G-Sync that I can recall was adding windowed mode and making more ports available. Wasn't that 2 or II thing a mistake? I remember something about Dell or someone announcing and then retracting a claim in this regard... but hey, I'm old :)

Not all G-Sync monitors have ULMB.

http://www.tftcentral.co.uk/reviews/asus_rog_swift_pg348q.htm#conclusion
http://www.tftcentral.co.uk/reviews/asus_rog_swift_pg27uq.htm#conclusion
#22
Vayra86
John Naylor said:
Couple of clarifications ...


It's not intended to, at least at present; the tech hasn't advanced that far. You turn off G-Sync to use ULMB... most find the break-off point to be between 70 and 80 *minimum* fps, but in speaking with users we've done builds for, who game a lot more than I have time to... most of them with the necessary GFX horsepower are using 1440p IPS AUOptronics 8 or 10 bit screens and using ULMB almost exclusively. It's not as if you use the 100 Hz or 120 Hz setting and if you have 90 fps all goes to hell. Look at the reviews on TFTCentral... not having ULMB is considered a major con.

3.5+ years old but still relevant ...

http://www.tftcentral.co.uk/articles/variable_refresh.htm



The only thing new about G-Sync that I can recall was adding windowed mode and making more ports available. Wasn't that 2 or II thing a mistake? I remember something about Dell or someone announcing and then retracting a claim in this regard... but hey, I'm old :)

Not all G-Sync monitors have ULMB.

http://www.tftcentral.co.uk/reviews/asus_rog_swift_pg348q.htm#conclusion
http://www.tftcentral.co.uk/reviews/asus_rog_swift_pg27uq.htm#conclusion
So the rabbit hole is a whole lot deeper, then. It's one big inconsistent mess with Gsync monitors at this point. I see. Thx

As for ULMB/strobe: yes, I can only agree. I really don't miss the variable refresh until I drop under 65-70 FPS on my current monitor, and I have strobe active all the time. I consider strobing a much stronger feature than variable refresh for any high-end spec rig tbh.