Shouldn't Nvidia's GPUs need to be Adaptive Sync (FreeSync) compatible, since those monitors already passed the Adaptive Sync (FreeSync) compatibility test??
I think Nvidia is trying to give Adaptive Sync (FreeSync) a bad name because their superior GPU architecture (!!) doesn't support adaptive sync properly. Maybe that's why Nvidia needs an external module (that consumes more power)!!!
Like many things, the view depends very much on where you are sitting. If you are spending $125 - $200 on your GFX card, you are facing an entirely different situation than when you are spending $500 - $700. The problem is, you can't take a $200 mindset into a $500+ card / monitor evaluation. To rephrase the words of the indomitable Inigo Montoya ... when you say "G-Sync" and "hardware module" ... I have to respond ... "those words don't mean what you think they mean."
After all these years ... the illusion that G-Sync and Freesync are two names for "the same thing" remains. I still shake my head every time I hear this ... my wife still thinks she drives a 4WD SUV when she actually has an AWD, and no amount of explanation will deter her from this viewpoint despite the fact that I have explained it dozens of times, showed her the written articles and .... towed her 4 times already with my 4WD SUV when she gets stuck in the snow. It's not a perfect analogy but it's the best I can think of at the moment.
-G-Sync = 4WD - Toyota Land Cruiser / Jeep Wrangler (sends power to all four wheels equally, with locking differentials and without vectoring ... each wheel will spin at the same constant rate as all the others). The vehicles are normally in 2WD or AWD mode ... to get to 4WD, you stop and switch to 4WD locked mode. Pick one or the other.
-Freesync = AWD - Honda CRV / Hyundai Tucson ... no lock, no switching, the system does the best it can with what it's got. Both can run in AWD mode; only true 4WD can switch between modes that are mutually exclusive.
A. Adaptive Sync - When initially tested against one another, the two methods of Adaptive Sync were described in reviews as follows:
a) Freesync works extremely well from 40 - 60 fps ..... above 60 fps, it still performs well but the impact on the user experience starts to tail off
b) G-Sync works extremely well from 30 - 60 fps ..... above 60 fps, it still performs well but the impact on the user experience starts to tail off
The key difference is at the bottom of the range: G-Sync's effective window extends down to 30 fps, where Freesync's starts at 40.
Nvidia's "external module" for one, is not "external" .... and t has nothing to do with adaptive sync. When you use "G-Sync",
the hardware module is disabled .... non-functional ... doing nothing .... might as well not be there.
B. Motion Blur Reduction (MBR) - G-Sync monitors also include a hardware module that provides an MBR technology called Ultra Low Motion Blur (ULMB). What this does is strobe the backlight to eliminate motion blur. You can see the effect in the typical demo video:
In the 1st part of such a video, you see a flash-card style display similar to an old alarm clock; as it changes from 07 to 08 to 09 in alternating black and white, the old number's ghosted image lingers on screen as each new number appears. What ULMB does is turn off the backlight between frames to get rid of those ghosted images.
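To put rough numbers on why strobing helps, here's a minimal sketch in Python. It assumes the common simplified persistence model (perceived blur ≈ eye-tracking speed × time each frame stays lit); the 960 px/s speed and 1.5 ms pulse width are my own illustrative values, not ULMB specs:

```python
# Simplified persistence model: an eye tracking a moving object smears each
# frame across (tracking speed x time the frame is lit) pixels.
def blur_width_px(speed_px_s: float, persistence_ms: float) -> float:
    return speed_px_s * (persistence_ms / 1000.0)

speed = 960.0  # hypothetical object speed across the screen, px/s

# Sample-and-hold at 120 Hz: each frame stays lit the full ~8.3 ms.
print(blur_width_px(speed, 1000.0 / 120))  # ~8.0 px of smear

# Strobed backlight (ULMB-style): frame lit only for a short pulse, ~1.5 ms.
print(blur_width_px(speed, 1.5))           # ~1.4 px of smear
```

Same frame rate, same content ... the only thing that changed is how long each frame stays visible, and the smear shrinks proportionally.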
When nVidia released G-Sync, AMD took the "almost as good but cheaper" marketing route. And it is a great approach in the lower priced segments. With that approach, including a strobing hardware module was out of the question, or the cost argument disappears. So Freesync provided the adaptive sync part but skipped the hardware part .... Freesync users have no MBR option ... at least as part of the "Freesync package".
To counter that deficiency, many vendors included their own hardware module / strobing technology (ASUS ELMB, BenQ DyAc, BenQ Blur Reduction, EIZO Turbo 240, EIZO Blur Reduction, LG Motion 240, LG Blur Reduction and Samsung's equivalent) on some of their models; these were usually close in cost to the corresponding nVidia G-Sync models from the same manufacturers. However, naive consumers went along believing Freesync = G-Sync, never looked beyond the cost savings or why MBR matters, and eschewed the more expensive Freesync models that included separate MBR technology. As a result, over time, the number of Freesync models so equipped has diminished. And again, this is just fine in the lower priced segments ... if you are producing 30 - 75 fps in the games you play, you have little to gain from a G-Sync monitor. So where does it matter?
Again, G-Sync adaptive sync and ULMB are mutually exclusive .... you cannot use both at the same time. If you have G-Sync on and want to use MBR technology, you have to TURN G-SYNC OFF in order to use ULMB. To use G-Sync, you have to turn off ULMB. In practice .... the point at which you might prefer ULMB over G-Sync is between 60 and 80 fps.
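If it helps, here's that rule of thumb as a tiny sketch (my own framing, not anything official; the 60 and 80 fps bounds are just the crossover zone described above):

```python
# Hedged sketch: suggest a monitor mode from average frame rate, using the
# 60-80 fps crossover zone. Exact preference varies by user and game.
def pick_mode(avg_fps: float, low: float = 60.0, high: float = 80.0) -> str:
    if avg_fps < low:
        return "adaptive sync (G-Sync / Freesync)"
    if avg_fps > high:
        return "ULMB / MBR strobing"
    return "tossup -- try both and see which you prefer"

for fps in (45, 70, 110):
    print(f"{fps} fps -> {pick_mode(fps)}")
```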
Say, for example, I have an MSI RTX 2080 OC'd to match TPU's test results.
-At 1080p, all but 1 game will see over 100 fps .... and that last one is at 88.7 ... wouldn't use adaptive sync here at all ... this is territory where most users will prefer ULMB over G-Sync / Freesync
-At 1440p, all but 7 games are over 100 fps .... 6 of those between 80 and 100, with Ghost Recon Wildlands @ 70.8 ... this is territory where most users will prefer ULMB over G-Sync / Freesync for all games except maybe GRW ... I would recommend trying both and seeing which you like better.
-At 2160p, only 4 games are over 100 fps .... with 7 more between 80 and 100 ... out of the 12 left, I would play 7 of them using adaptive sync. So, out of 23 games (see the tally sketch after this list):
... 11 games => ULMB
... 5 games => tossup between ULMB and adaptive sync
... 7 games => G-Sync / Freesync
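And here's how that tally falls out as a quick sketch, using made-up 4K frame rates shaped like the breakdown above (illustrative only ... I'm not reproducing TPU's per-game numbers) and the same 60 / 80 fps cutoffs:

```python
from collections import Counter

# Hypothetical 4K results shaped like the breakdown above: 4 games over 100,
# 7 between 80 and 100, 5 in the 60-80 tossup zone, 7 in the 38-60 range.
fps_4k = [112, 108, 104, 101,
          98, 95, 92, 90, 87, 84, 81,
          78, 74, 70, 66, 62,
          58, 55, 52, 48, 45, 42, 40]

def classify(fps: float) -> str:
    if fps >= 80:
        return "ULMB"
    if fps >= 60:
        return "tossup"
    return "G-Sync / Freesync"

print(Counter(classify(f) for f in fps_4k))
# Counter({'ULMB': 11, 'G-Sync / Freesync': 7, 'tossup': 5})
```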
In short .... you're in a better position sitting w/ a G-Sync monitor for 16 of the 23 games ... for the other 7, G-Sync / Freesync wouldn't matter ... however, in those games AMD's best card is running 4 - 12 fps behind among those 38 - 60 fps games.
Again, obviously this example is in a price / performance niche in which AMD doesn't compete well .... hopefully that will change in the fall. But as we get down to the 1650 / 570 level .... this is a "whole 'nother ball game" ... most games are going to be in that 30 - 70 fps range, in which case turning off adaptive sync to use MBR technology is not a realistic option.
In summary ... it's hard to argue that the G-Sync / Freesync decision matters in the lower budget ranges. Unless you can consistently stay above 60 fps at your resolution, it doesn't. However, if you are selecting a monitor capable of 100 Hz or more ... and a card that can drive your system up around 75 fps half the time or more ... then it's hard for me to see current AMD Freesync options as an attractive solution.