I don't know if it's that people don't understand or just don't want to understand. 99% of folks are still preaching that having a G-Sync monitor is the same as having a Freesync monitor. Why? It's obviously not true, and yet the myth persists. G-Sync includes a (not free) hardware module which provides motion blur reduction technology (ULMB). When ya have the GFX horsepower to get you to 70+ fps, more often than not you are going to be better off turning off G-Sync and using ULMB. That option is not built into Freesync monitors, though something similar is sometimes offered, with varying success, by monitor vendors. Because those models often cost just as much as G-Sync monitors, they haven't been big sellers and their numbers have diminished. If ya want to see what motion blur reduction does, look at this:
http://frames-per-second.appspot.com/
AMD was wise to try and take advantage of this with their Card + Monitor combos, providing the appearance of more value by suggesting that users could maybe not quite get the same performance but were certainly getting a much better value .... but it's "not the same" if you are not getting MBR technology. I have no issue with folks going in eyes open saying "well, I won't use MBR tech anyway as I'm only looking for 45-60 fps" .. that's fine. But if most games are delivering 70 fps or better, it's not the same, and most users are unaware of the distinction. If you used both at 70-100 fps and don't see the value in the extra investment? ... again, fine. All I am saying is be aware of the capabilities and make an informed decision.
As for PhysX, I have to agree with THG when they said essentially that if you have it, it's not like you are going to turn it off. I never saw a significant fps impact, though it can be a strain on lower-end cards. But the mindset that "proprietary is bad" is misplaced. If GM invents a new engine technology, spending billions to bring it to market, do you think they should be obligated to "share it with Ford"? To suggest so is to suggest that innovation die. Why would anyone invest millions in R & D if they are obligated to share it? Let each player in the market do what they can to improve things, and let the chips fall where they may:
http://store.steampowered.com/hwsurvey
When PhysX first broke, I frankly was not impressed with the initial demo videos until the split-screen Batman one appeared. Then it became painfully obvious in the cape, explosions, curtains, debris and other effects. Once seen, it could not be unseen, and you were not going to turn it off if you had the hardware.
Until we see such a demonstration, no one is really in a position to judge just how well it will work. While ya could assume now "it must be better" or "it won't mean beans", we really have no basis to make an informed decision at this point. GFX horsepower spent on RT may mean that there's not enough HP left for other stuff. The proof, as the saying goes, is in the pudding, and until the pudding is served, there's not much to comment on. Even then, the technique will be in its infancy, and it will be years before we see full implementation, assuming it "sticks".