Remember that anything Nvidia has ever done has been for their own profit and interest, no exception. This is why they waited so long before enabling FreeSync support; they are now trying to leverage the whole market their way. To think that people believed this was just them being nice.
Better said ... "Remember that anything any corporation has ever done has been for its own profit and interest, no exception."
Corporate officers are required, by law, to act in the best interests of their stockholders ... within the law, of course. Failure to do so is cause for legal action, punishment and dismissal. nVidia making allowances for people with Freesync monitors is the proverbial "no brainer". They own the top 5 tiers ... outside of die-hard loyalists who will ignore the numbers, people comparing cards in each market price niche on price, performance, OC ability, power, temperature and noise can only come away with an nVidia choice. Years back, with no competition at the top end, nVidia was competing with itself: folks grabbing two lower-tier cards in SLI instead of the top-tier card for the better performance / price ratio meant nVidia was losing x80 / x80 Ti sales to its own lower-tier cards. Now the biggest obstacle to increased sales is folks who bought a Freesync monitor. By removing that obstacle, nVidia gains and AMD loses.
This stinks like the GeForce Partner Program (GPP). Consult the lists above rather than marketing material that may have been skewed in favor of NVIDIA.
The spin doctors did a great job on this one. nVidia has clamped down here, and the consumer is the one that lost. It's no secret that since Boost 3 arrived, nVidia has locked out much of the performance that could be gained from overclocking. So nVidia remains committed to limiting how much cards can do, because why buy a Ti when an x80 OC'd to the max gives you all you can use? Not like there's another option at that performance level. So nVidia came up with an idea ... I imagine the board room conversation going something like this:
Tech Dude -
"We could gain a lot more performance with a few minor PCB and cooling performance. Why don't we allow our AIB partners to add them and when the drivers detect these cards, we can allow loosened restraints on the cards ... say power slider goes from 20% to 35 % and voltage goes ...."
Marketing Dude -
"Let me interrupt a second .... let's say we do this and and make a deal with MSI for example to give them more overclocking headroom for their new MSI "Thor's Hammer 2080 Ti" card ... we help fund a new joint marketing campaign detailing this cooperative initiative to bring levels of performance well above what is otherwise available to "non partners". So Johnny reads all the reviews of the 2080 Ti on TPU and makes his XMas Wish List from the top 3 2080 Tis that TPU has reviewed. Now Mom fires up google... she finds the prices as listed :
1. MSI Thor's Hammer - 255 fps in TPU's OC test ($1,550)
2. MSI Lightning Z - 237 fps ($1,450)
3. MSI Gaming X Trio - 227 fps ($1,350)
4. EVGA FTW - 225 fps ($1,500)
5. Asus Strix - 225 fps ($1,370)
Mom may be rich but she has an eye for a bargain, so she's leaning towards the Strix ... she decides to do one more search on "MSI Thor's Hammer" and finds the MSI Thor's Hammer Vega 64 for $850 ... Mom is beside herself: she found her darling's No. 1 desired Xmas present, and for half the price! She's not aware that it delivers less than half the fps of any of the cards on the list. Or forget the mom; take any normal guy / gal who isn't a PC enthusiast, building a box, who remembers seeing the headline of an article ... "Thor's Hammer sets record on performance" ... or who remembers the nerd on his floor in the college dorm proclaiming the Thor's Hammer something-something (couple of numbers and letters) was da bomb.
So why should we invest our time, money and effort into R&D, product agreements, driver software re-engineering, marketing, etc., if AMD or MSI can slap the "Thor's Hammer" name on any product they choose?"
Tech Dude -
"Uhhh, I can't answer that boss"
A basic marketing course will cover the importance of branding in the first few weeks, and one of the basic tenets of evaluating a branding strategy is "Does the brand convey the uniqueness of what I am offering and why it's important?" So yes, having invested significant financial resources to offer a unique set of performance options by stepping up the technology given to partners to create a "brand", anyone would be foolish to allow that "brand" to be used by anyone else, let alone a competitor. You don't see Dodge allowing "Ram" to be used by Ford and General Motors ... We even have Taylor Swift countersuing "Swiftlife Computer Services" after she came out with her app "Swiftlife" 10 years after the firm copyrighted it. The same occurred when she infringed on "Lucky 13". Branding has value ... that's why there are so many suits over infringement.
As to the two versions of the technology being the same ... that's both true and not true (unless updates have changed things):
- Freesync is an adaptive sync technology that has a functional range of usefulness beginning at about 40 fps. The impact on the user experience begins to tail off above 60 fps.
- G-sync is an adaptive sync technology that has a functional range of usefulness beginning at about 30 fps. The impact on the user experience begins to tail off above 60 fps. Where the solutions differ is that G-Sync includes a hardware module which provides Motion Blur Reduction (MBR) technology.
As fps climbs further past 60, the impact on the user experience starts to favor nVidia's motion blur reduction technology (ULMB) over nVidia's adaptive sync. Users with high refresh rate monitors (120 Hz+) will tend to favor the MBR alternative over adaptive sync somewhere between 70 and 90 fps. Some Freesync-compatible monitors have their own manufacturer-specific MBR technologies, but these vary between manufacturers.
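To make those "functional ranges" concrete, here's a rough Python sketch of how a variable-refresh display tracks the GPU's frame rate inside its window. This is my own illustration, not anything from nVidia's or AMD's actual firmware: the 40 Hz / 30 Hz floors and the 144 Hz ceiling are assumed numbers, and the frame-doubling behavior below the floor only exists on displays with low-framerate compensation.

```python
# Illustrative sketch only: how a variable-refresh (adaptive sync) display
# might pick its refresh rate from the GPU's frame rate. Floors/ceiling are
# assumptions, not any specific monitor's spec.

def vrr_refresh_hz(fps: float, floor_hz: float, ceiling_hz: float) -> float:
    """Return the refresh rate a VRR display would run at for a given frame rate."""
    if fps >= ceiling_hz:
        return ceiling_hz          # capped at the panel's maximum refresh
    if fps >= floor_hz:
        return fps                 # inside the VRR window: refresh tracks fps 1:1
    # Below the floor, low-framerate compensation (where supported) repeats each
    # frame so the panel still runs inside its window (e.g. 25 fps shown at 50 Hz).
    multiplier = 2
    while fps * multiplier < floor_hz:
        multiplier += 1
    return min(fps * multiplier, ceiling_hz)

# Ballpark ranges from the post: Freesync-style ~40 Hz floor, G-Sync-style ~30 Hz floor.
for fps in (25, 35, 60, 100, 200):
    print(fps, "fps ->",
          vrr_refresh_hz(fps, 40, 144), "Hz (40-144 window) /",
          vrr_refresh_hz(fps, 30, 144), "Hz (30-144 window)")
```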
Why the heck is "G-Sync Compatible" even a thing now? With Nvidia supporting adaptive sync, are these manufacturers still paying the G-Sync tax? Is JHH lying as he did with that BS "FreeSync doesn't work" theory?
I obviously worded my question wrongly, as I didn't mean to say they are different standards, but different names for basically the same thing.
You misunderstand what G-Sync is.
G-Sync = Adaptive sync + Hardware module for ULMB. The hardware module costs money and is a standardized nVidia design.
Freesync = Adaptive Sync + Nothing. Some manufacturers add an MBR hardware module, which also costs money. See list here:
https://www.blurbusters.com/freesync/list-of-freesync-monitors/
The "tax", as you call it, is the cost of the MBR hardware module; if you get it, you pay for it, regardless of whether it's Freesync with the monitor manufacturer's hardware module or G-Sync with nVidia's hardware module. To my eyes ...
a) If you're gaming at 40 - 80 fps in all ya games and are using a card selling today at < $280, I'd recommend a Freesync 60 - 75 Hz monitor. You are not going to be using MBR.
b) If you're gaming at 30 - 100+ fps in all ya games and are using a card selling today at > $280, I'd recommend a G-Sync monitor. Use G-Sync in games in which you get up to 70 - 80 fps.
Turn G-Sync off and use ULMB if you're routinely getting over 70 - 80 fps (rough decision sketch below).
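If it helps, here's that rule of thumb as a tiny Python sketch. The function name, the default 30 Hz floor and the ~75 fps crossover are my own placeholders based on the ballpark numbers above, not anything official from either vendor.

```python
# Rough decision helper encoding the rule of thumb above (my own sketch;
# the floor and crossover defaults are assumptions, not vendor specs).

def sync_mode(typical_fps: float, vrr_floor_hz: float = 30.0,
              has_mbr: bool = True, crossover_fps: float = 75.0) -> str:
    """Suggest adaptive sync vs. motion-blur reduction for a given typical frame rate."""
    if typical_fps < vrr_floor_hz:
        return "below the VRR window: lower settings / resolution first"
    if typical_fps < crossover_fps or not has_mbr:
        return "adaptive sync (G-Sync / Freesync) on"
    # MBR / ULMB strobes at a fixed refresh rate, so adaptive sync must be off.
    return "adaptive sync off, ULMB / MBR on"

print(sync_mode(55))                   # -> adaptive sync on
print(sync_mode(120))                  # -> adaptive sync off, ULMB / MBR on
print(sync_mode(120, has_mbr=False))   # Freesync panel without an MBR mode
```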
http://www.tftcentral.co.uk/articles/variable_refresh.htm
"Both G-sync and FreeSync operate on this principle of dynamically controlling [adaptive sync] the refresh rate. There are a few differences between how the technology is implemented though. NVIDIA G-sync requires a proprietary G-sync module to be added to the monitor, which comes at quite a high cost premium....
G-sync modules also support a native blur reduction mode dubbed ULMB (Ultra Low Motion Blur). This allows the user to opt for a strobe backlight system if they want, in order to reduce perceived motion blur in gaming. It cannot be used at the same time as G-sync since ULMB operates at a fixed refresh rate only, but it's a useful extra option for these gaming screens. Of course since G-sync/ULMB are an NVIDIA technology, it only works with specific G-sync compatible NVIDIA graphics cards. While you can still use a G-sync monitor from an AMD/Intel graphics card for other uses, you can't use the actual G-sync or ULMB functions. ....
There is no native blur reduction mode coupled with FreeSync support so it is down to the display manufacturer whether they add an extra blur reduction method themselves. "