
NVIDIA may be doing something nefarious with "G-SYNC Compatible"

I have yet to come across a for-profit company whose every action isn't driven by its own profit and interest.

Take the case at hand: FreeSync is a free-to-use technology. By definition it does not profit AMD in any way.
 
A sensational clickbait title.
I'm open to suggestions.

The page says
View attachment 120511

but the specs say, in small print,

View attachment 120512


So a 144 Hz FreeSync monitor: technically that's true, but you can't have 144 Hz and FreeSync at the same time.


Nvidia mentions Adaptive Sync in addition to G-SYNC. Is Nvidia even allowed to market this as FreeSync?
The panel is 144 Hz fixed refresh rate (Asus's number) or 35-90 Hz variable refresh rate (verified by AMD). Why that is, I don't know.


Aren't FreeSync and Adaptive Sync differing standards?
The former is implemented by AMD, the latter is an industry standard.
FreeSync is AMD's implementation of the VESA Adaptive Sync standard.
G-SYNC Compatible is NVIDIA's implementation of the VESA Adaptive Sync standard.

Haven't those AOC panels been named FreeSync compatible before? If so, it'll work.
The only explanation I can think of is that there is a revision in the panel and they haven't gotten it verified by AMD. Even if this is the case, it is still disingenuous because they're selling the new panel under the old panel's model number which misleads consumers because it appears on both lists.

Also, isn't your first thought not that the retailer is trying to confuse the buyer, but rather that Nvidia is forcing the entire industry to remove FreeSync branding?
That's what we don't know. Either way, it's a problem.
 
Take the case at hand: FreeSync is a free-to-use technology. By definition it does not profit AMD in any way.
Really? Does not profit AMD? Because I thought when Vega launched, they made it all about FreeSync.
 
AMD doesn't collect any fees, royalties, or payments in relation to FreeSync. It costs AMD time (and time is money) to certify monitors, which they do on their own dime as a service to display manufacturers. The majority of the risk/cost is on the display manufacturers.
 
Take the case at hand: FreeSync is a free-to-use technology. By definition it does not profit AMD in any way.

Are you serious?!?!?! Increased popularity of Freesync helps AMD sell their video cards! You need an AMD video card to fully utilize Freesync!

AMD doesn't collect any fees, royalties, nor payments in relation to FreeSync. It costs AMD time (and time is money) to certify monitors which they do on their own dime as a service to display manufacturers. The majority of the risk/cost is on the display manufacturers.

It helps AMD sell their video cards!
 
FreeSync is AMD's implementation of the VESA Adaptive Sync standard.
G-SYNC Compatible is NVIDIA's implementation of the VESA Adaptive Sync standard.
I obviously worded my question wrongly; I didn't mean to say they are different standards, but different names for basically the same thing.
It's my understanding AMD developed FreeSync, but the industry standard is Adaptive Sync, which AMD are happy to back without worrying about enforcing the FreeSync branding, but which Nvidia clearly has issues with, wanting to enforce their G-SYNC branding despite those monitors being Adaptive Sync capable.
Either way, the marketing should be transparent and state in advertising that a monitor is capable of both, as there are still those willing to flog the G-SYNC horse despite it dying a slow death.
 
Increased popularity of Freesync helps AMD sell their video cards

Looking at their market share, it seems like it didn't help them do shit. But it would be hard to quantify this regardless.

You need an AMD video card to fully utilize Freesync!

No, you don't. AMD had, at the time, the only cards that could do it, but that's simply because no one else supported it. Now there is also Nvidia, and it's not like there are a million choices, are there?

At the end of the day, this is a technology that even their competitors can implement, helping them sell cards too, right? How does AMD profit from that, huh?
 
Are you serious?!?!?! Increased popularity of Freesync helps AMD sell their video cards! You need an AMD video card to fully utilize Freesync!
And yet Nvidia is now benefitting from it. How nice of them!
 
In case you don't know, the evil empire Nvidia has a competing technology called G-SYNC. FreeSync was created to counter that, and to help AMD sell video cards AT THAT TIME!

That Nvidia cards can now work with SOME FreeSync monitors is a recent development. Things change. But initially, FreeSync ONLY worked with AMD video cards!

Looking at their market share, it seems like it didn't help them do shit. But it would be hard to quantify this regardless.

My argument, then, is that without FreeSync helping AMD sell video cards, AMD's market share would be even lower than it is now. But then, like you SAID, how do you quantify that, huh?
 
In case you don't know, the evil empire nvidia

You know what, you are clearly trying to steer this conversation into some sort of fanboy shitshow with this "evil Nvidia this, evil Nvidia that" mocking crap. Believe what you want.
 
Are you serious?!?!?! Increased popularity of Freesync helps AMD sell their video cards! You need an AMD video card to fully utilize Freesync!



It helps AMD sell their video cards!
You don't need an AMD video card; that's the point of open standards.

You're talking about initial adoption. Of course the first manufacturer had an advantage, being the first company to release a free standard for wired VRR to consumers; obviously they were the only purveyors, but anyone else can make something that works with it.
 
You know what, you are clearly trying to steer this conversation into some sort of fanboy shitshow with this "evil Nvidia this, evil Nvidia that" mocking crap. Believe what you want.

Because the title of the thread is flame-war clickbait, and I expect AMD fanboys to come out of the woodwork to say, "yeah, I know it, Nvidia is planning something evil to steal our money!"
 
Because the title of the thread is flame-war clickbait, and I expect AMD fanboys to come out of the woodwork to say, "yeah, I know it, Nvidia is planning something evil to steal our money!"
Yet it's you peddling that line; most are saying it could be, but there isn't enough proof.

You're not adding much by calling the people talking about it fanboys, or by putting down bullshit no one is saying as if it's what they're saying. Try reading.
 
Because the title of the thread is flame-war clickbait, and I expect AMD fanboys to come out of the woodwork to say, "yeah, I know it, Nvidia is planning something evil to steal our money!"

Seems to me you swallowed the bait :roll:

Please excuse people for thinking NVIDIA might be trying to pull something like GPP a year ago...
 
Yet it's you peddling that line; most are saying it could be, but there isn't enough proof.

You're not adding much by calling the people talking about it fanboys, or by putting down bullshit no one is saying as if it's what they're saying. Try reading.

No, you try reading. Read my first post: it was to counter the statement that everything Nvidia does is for its own profit and interest. I merely stated that this is true for every for-profit organization. Nvidia is no exception.
 
That's true for every money-maker; however, when greed exceeds a certain threshold and hurts consumers, that's when you need to take issue with such practices. Right now there's not "enough" evidence to establish anything; however, if G-SYNC Compatible "branding" buries FreeSync, then it hurts consumers. How is anyone supposed to know whether a G-SYNC "Compatible" monitor also works with FreeSync/AMD GPUs? Do you suppose people should just buy them and roll the dice :wtf:
 
Seems to me you swallowed the bait :roll:

Please excuse people for thinking NVIDIA might be trying to pull something like GPP a year ago...
Yes, Nvidia is trying to make money because they are a for-profit company, but by certifying monitors they are also making sure a monitor will work as they intended, like with G-SYNC from day one. A lot of people forget that when G-SYNC first came out, it worked just about as intended, whereas FreeSync monitors had a ton of issues like ghosting, because the panels used were the cheapest, bottom-of-bin panels. Nvidia did a lot of R&D work, including testing dozens or even hundreds of panels to see which ones worked well with the tech. To me, they are more dedicated to having it work right from the start rather than just slapping their name on it and hoping for the best. FreeSync has come a long way from the early days, but if you say Nvidia is doing something nefarious because they want a say in whether their logo is used to indicate that a monitor works as intended, you have a very questionable definition of "nefarious". With VRR being an open standard, each company is going to implement it in their own way, which might work fine on one company's cards but not on another's. Guess which side is going to be left dealing with the fixes for a monitor company's screw-up? Exactly: it will be left to the GPU maker to fix someone else's screw-up.
 
I agree, and this should perhaps be the main point.
Is Nvidia twisting monitor makers' arms to remove any AMD labeling? Well, the evidence is slim, but given that it's Nvidia, I can't help but expect more.

But as you say, regardless of the above, Nvidia has made a complete mess of its own standard, in that G-SYNC is not the same as this new G-SYNC Compatible (which is essentially FreeSync tech).
And it's going to take being fully in the know about tech to buy the version you're after, and likely some double-checking of specs.
Average Joes have no chance of understanding what's going on with all these versions of variable refresh.

You might very well be on to Nvidia's tactic with this G-SYNC affair. One of the first rules of controlling public opinion is: create confusion.
 
I obviously worded my question wrongly; I didn't mean to say they are different standards, but different names for basically the same thing.
It's my understanding AMD developed FreeSync, but the industry standard is Adaptive Sync, which AMD are happy to back without worrying about enforcing the FreeSync branding, but which Nvidia clearly has issues with, wanting to enforce their G-SYNC branding despite those monitors being Adaptive Sync capable.
Either way, the marketing should be transparent and state in advertising that a monitor is capable of both, as there are still those willing to flog the G-SYNC horse despite it dying a slow death.
I think the reason why AMD created FreeSync is because they saw early on that adaptive sync wasn't a cut and dry thing like, say, USB. Monitors are relatively dumb devices so to make adaptive sync work, the smart device (GPU) has to give the monitor exactly what it expects when it expects it. Because those tweaks are AMD's alone, they had reason to differentiate their optimizations from everyone else's; hence, FreeSync was trademarked. G-SYNC Compatible was an inevitability for the same reason: NVIDIA needs a way to differentiate their optimizations from AMD's.

There's nothing inherently wrong with G-SYNC Compatible nor FreeSync. The problem AOC presents is that perhaps G-SYNC Compatible licensing language may require exclusivity. Just like how NVIDIA required that ROG couldn't include AMD GPUs, NVIDIA may be requiring G-SYNC Compatible models to not also have FreeSync branding.
 
Yes, Nvidia is trying to make money because they are a for-profit company, but by certifying monitors they are also making sure a monitor will work as they intended, like with G-SYNC from day one. A lot of people forget that when G-SYNC first came out, it worked just about as intended, whereas FreeSync monitors had a ton of issues like ghosting, because the panels used were the cheapest, bottom-of-bin panels. Nvidia did a lot of R&D work, including testing dozens or even hundreds of panels to see which ones worked well with the tech. To me, they are more dedicated to having it work right from the start rather than just slapping their name on it and hoping for the best. FreeSync has come a long way from the early days, but if you say Nvidia is doing something nefarious because they want a say in whether their logo is used to indicate that a monitor works as intended, you have a very questionable definition of "nefarious". With VRR being an open standard, each company is going to implement it in their own way, which might work fine on one company's cards but not on another's. Guess which side is going to be left dealing with the fixes for a monitor company's screw-up? Exactly: it will be left to the GPU maker to fix someone else's screw-up.

You are right. Nvidia has worked hard; they have the budget and manpower, and it shows. But they are also an aggressive company in a dominant position in the market, so I wonder what we can expect.
My guess is they'll hijack all premium brands like Predator, Agon, ROG etc., and those will become G-SYNC Compatible only: no FreeSync branding, no mention in the specs. Will they work with AMD cards? Spend 500 dollars to find out. And they could try to go further than that, but Intel will also enter the market, so I'm not worried; GPP stopped when Intel's partners said no!

(edited for quote)
 
Well, that's what 82% vs 18% market share will get you. NVIDIA basically admitted G-SYNC defeat by adopting the open standard, and in the end their branding will win out because of install base; every manufacturer is gonna plaster G-SYNC Compatible like a badge of honor. NVIDIA won't even have to push that hard.

They really only admitted defeat with the required module, and even then, not really because the best form of G-Sync still uses the module.

The problem was, like I said, AMD would slap a Freesync sticker on pretty much anything. There wasn't really any quality control there, while nVidia is still implementing at least some levels of quality control to their branding.

Aren't FreeSync and Adaptive Sync differing standards?
The former is implemented by AMD, the latter is an industry standard.

No, FreeSync is just AMD's rebranding and implementation of the already-free adaptive sync standard, rolled out by the DisplayPort standard and eventually the HDMI standard too.

AMD doesn't collect any fees, royalties, nor payments in relation to FreeSync. It costs AMD time (and time is money) to certify monitors which they do on their own dime as a service to display manufacturers. The majority of the risk/cost is on the display manufacturers.

Yeah, but they get to say their cards support a technology that sort of competes with nVidia's "premium" G-Sync feature. It is one of those "look at us, we can do that too, please buy our products" tactics. And they definitely profited from it.
 
They really only admitted defeat with the required module, and even then, not really because the best form of G-Sync still uses the module.

The problem was, like I said, AMD would slap a Freesync sticker on pretty much anything. There wasn't really any quality control there, while nVidia is still implementing at least some levels of quality control to their branding.


Nvidia lost two things with adoption of the open standard, profits from modules, and more importantly exclusivity. It was a hard pill to swallow, let me quote Jensen here:

"We never competed. [FreeSync] was never proven to work. As you know, we invented the area of adaptive sync. The truth is most of the FreeSync monitors do not work. They do not even work with AMD's graphics cards."

You're right, they didn't admit defeat. :)

FreeSync was a bit of a mess with the bad panels and no real standard, but it was, and is, a lot cheaper: just an added value for the display, and not a third of its cost.
 
They really only admitted defeat with the required module, and even then, not really because the best form of G-Sync still uses the module.

The problem was, like I said, AMD would slap a Freesync sticker on pretty much anything. There wasn't really any quality control there, while nVidia is still implementing at least some levels of quality control to their branding.



No, FreeSync is just AMD's rebranding and implementation of the already free adaptive sync standard rolled out by the Displayport standard and eventually the HDMI standard too.



Yeah, but they get to say their cards support a technology that sort of competes with nVidia's "premium" G-Sync feature. It is one of those "look at us, we can do that too, please buy our products" tactics. And they definitely profited from it.
It got rolled out because AMD developed it, then submitted it to the VESA and HDMI standards groups for inclusion in their standards.
Nvidia did premium G-SYNC first and insisted at the time that monitors had to have special circuits to do it, despite them providing the GPUs for laptops, which already had a version of it.

I suppose it's how you say it, though.
 
Remember that anything Nvidia has ever done has been for their own profit and interest, no exception. This is why they have waited so much time before enabling FreeSync support, they are now trying to leverage the whole market their way. To think that people believed this was just them being nice.

Better said: "Remember that anything any corporation has ever done has been for its own profit and interest, no exception." Corporate officers are required, by law, to act in the best interests of their stockholders, within the law of course. Failure to do so is cause for legal action, punishment, and dismissal. Nvidia making allowances for people with FreeSync monitors is the proverbial "no-brainer". They own the top five tiers; outside die-hard loyalists who will ignore the numbers, people comparing cards in each market price niche based on price, performance, OC ability, power, temperature, and noise can only come away with an nVidia choice. Years back, with no competition at the top end, nVidia was competing with itself: when people grabbed two lower-tier cards in SLI instead of the top-tier card because of the higher performance/price ratio, nVidia was losing x80 / x80 Ti sales to lower-tiered nVidia cards. Now the biggest obstacle they have to increased sales is folks who bought a FreeSync monitor. By removing that obstacle, nVidia gains and AMD loses.

This stinks like GeForce Partner Program (GPP). Consult the lists above rather than looking at marketing material that may have been skewed in favor of NVIDIA.

The spin doctors did a great job on this one. It's no secret that nVidia has clamped down here, and the consumer is the one that lost. It's no secret that since Boost 3 arrived, nVidia has locked out much of the performance that could be gained from overclocking. So nVidia remains committed to limiting how much cards can do, because why buy a Ti when an x80 OC'd to the max gives you all you can use? It's not like there's another option at that performance level. So nVidia came up with an idea... I imagine the boardroom conversation going something like this:

Tech Dude - "We could gain a lot more performance with a few minor PCB and cooling improvements. Why don't we allow our AIB partners to add them, and when the drivers detect these cards, we can loosen the restraints on them... say, the power slider goes from 20% to 35% and voltage goes..."

Marketing Dude - "Let me interrupt a second... let's say we do this and make a deal with MSI, for example, to give them more overclocking headroom for their new MSI 'Thor's Hammer 2080 Ti' card... we help fund a new joint marketing campaign detailing this cooperative initiative to bring levels of performance well above what is otherwise available to 'non-partners'. So Johnny reads all the reviews of the 2080 Ti on TPU and makes his Xmas wish list from the top three 2080 Tis that TPU has reviewed. Now Mom fires up Google... she finds the prices as listed:

1. MSI Thor's Hammer - 255 fps in TPU's OC test ($1,550)
2. MSI Lightning Z - 237 fps ($1,450)
3. MSI Gaming X Trio - 227 fps ($1,350)
4. EVGA FTW - 225 fps ($1,500)
5. Asus Strix - 225 fps ($1,370)


Mom may be rich, but she has an eye for a bargain, so she's leaning towards the Strix... she decides to do one more search on the MSI Thor's Hammer and finds the MSI Thor's Hammer Vega 64 for $850... Mom is beside herself: she's found her darling's No. 1 desired Xmas present, and for half the price! She's not aware that it delivers less than half the fps of any of the cards on the list. Or forget the mom; take any normal guy or gal who isn't a PC enthusiast, building a box, who remembers seeing the headline of an article ('Thor's Hammer' sets record on performance) or remembers the nerd on his floor in the college dorm proclaiming the Thor's Hammer something-something (coupla numbers and letters) was da bomb.

So why should we invest our time, money, and effort into R&D, product agreements, driver software re-engineering, marketing, etc., if AMD or MSI can slap the 'Thor's Hammer' name on any product they choose?"

Tech Dude - "Uhhh, I can't answer that, boss."

A basic marketing course will cover the importance of branding in the first few weeks, and one of the basic tenets of evaluating a branding strategy is: "Does the brand share the uniqueness of what I am offering and why it's important?" So yes, having invested significant financial resources to offer a unique set of performance options by stepping up the technology to partners to create a "brand", anyone would be foolish to allow that "brand" to be used by anyone else, let alone a competitor. You don't see Dodge allowing "Ram" to be used by Ford and General Motors... We even have Taylor Swift countersuing "Swiftlife Computer Services" after she came out with her app "Swiftlife" 10 years after the firm copyrighted it. The same occurred when she infringed on "Lucky 13". Branding has value... that's why there are so many suits over infringement.


As to the two versions of the technology being the same ... that's both true and not true. Unless updates have changed things....

- Freesync is an adaptive sync technology that has a functional range of usefulness beginning at about 40 fps. The impact on the user experience begins to tail off above 60 fps.

- G-sync is an adaptive sync technology that has a functional range of usefulness beginning at about 30 fps. The impact on the user experience begins to tail off above 60 fps. Where the solutions differ is that G-Sync includes a hardware module which provides Motion Blur Reduction (MBR) technology.

As fps increases past 60, the impact on the user experience favors nVidia's motion blur reduction technology (ULMB) over nVidia's adaptive sync. Users with high-refresh-rate monitors (120 Hz+) will tend to favor the MBR alternative to adaptive sync somewhere between 70 and 90 fps. Some FreeSync-compatible monitors have their own manufacturer-specific MBR technologies, but these vary between manufacturers.
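The range behavior described above can be sketched as a toy model. This is only an illustration of how a VRR window behaves in general, using the rough figures quoted in this thread (e.g. the 35-90 Hz AOC panel); real monitors implement low-framerate compensation (LFC) in firmware/drivers, and the exact behavior varies by vendor:

```python
# Toy model of a variable-refresh-rate (VRR) window.
# The 35-90 Hz window below is the AOC panel's range discussed in this
# thread; the LFC behavior is a generic sketch, not any vendor's exact
# implementation.

def effective_refresh_hz(fps: float, vrr_min: float, vrr_max: float) -> float:
    """Refresh rate the panel actually runs at for a given frame rate."""
    if fps > vrr_max:
        return vrr_max              # capped at the top of the window
    if fps >= vrr_min:
        return fps                  # in-window: refresh tracks frame rate
    # Below the window: repeat each frame enough times (LFC-style)
    # to bring the panel back into its supported range.
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return min(fps * multiplier, vrr_max)

# A 35-90 Hz FreeSync window:
print(effective_refresh_hz(60, 35, 90))   # in-window: tracks fps
print(effective_refresh_hz(120, 35, 90))  # capped at 90
print(effective_refresh_hz(20, 35, 90))   # frame-doubled back into range
```

This also shows why the "144 Hz FreeSync monitor" claim earlier in the thread is misleading: above the top of the VRR window, the panel is no longer syncing to the frame rate at all.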

Why the heck is G-SYNC Compatible even a thing now? With Nvidia supporting Adaptive Sync, are these manufacturers still paying the G-SYNC tax? Is JHH lying as he did with that BS "FreeSync doesn't work" theory :shadedshu:

I obviously worded my question wrongly, as I didn't mean to say they are different standards, but different names for basically the same thing.

You misunderstand what G-Sync is.

G-Sync = Adaptive Sync + a hardware module for ULMB. The hardware module costs money and is a standardized nVidia design.
FreeSync = Adaptive Sync + nothing. Some manufacturers add an MBR hardware module, which also costs money. See the list here:

https://www.blurbusters.com/freesync/list-of-freesync-monitors/

The "tax", as you call it, is the cost of the MBR hardware module; if you get it, you pay for it regardless of whether it's FreeSync with the monitor manufacturer's hardware module or G-Sync with nVidia's hardware module. To my eyes...

a) If you're gaming at 40-80 fps in all your games and are using a card selling today at < $280, I'd recommend a FreeSync 60-75 Hz monitor. You are not going to be using MBR.
b) If you're gaming at 30-100+ fps in all your games and are using a card selling today at > $280, I'd recommend a G-Sync monitor. Use G-Sync in games in which you get up to 70-80 fps. Turn G-Sync off and use ULMB if you're routinely getting over 70-80 fps.
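The two recommendations above boil down to a simple decision rule. Here is a minimal sketch of that rule as code; the fps and price thresholds are this poster's rough rules of thumb, not official NVIDIA or AMD guidance, and the function name is just for illustration:

```python
# Sketch of the a)/b) recommendation heuristic above.
# Thresholds ($280, 70-80 fps) are the poster's rough rules of thumb.

def recommend_monitor(fps_min: int, fps_max: int, card_price_usd: float) -> str:
    """Return a rough monitor-tech suggestion for a given card and frame rates."""
    if card_price_usd < 280 and fps_min >= 40 and fps_max <= 80:
        # Budget card, moderate frame rates: adaptive sync alone covers this.
        return "FreeSync 60-75 Hz monitor (MBR/ULMB not needed)"
    if card_price_usd >= 280:
        if fps_max >= 80:
            # High frame rates: strobing backlight (ULMB) helps more than VRR.
            return "G-Sync monitor: ULMB above ~70-80 fps, G-Sync below"
        return "G-Sync monitor: use G-Sync (adaptive sync) mode"
    return "Either works; pick by price and panel quality"

print(recommend_monitor(40, 75, 250))
print(recommend_monitor(30, 120, 700))
```

The point of writing it out is that the choice hinges on two inputs most buyers never see spelled out together: the frame rates they actually hit, and whether their budget puts the module's cost in reach.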

http://www.tftcentral.co.uk/articles/variable_refresh.htm

"Both G-sync and FreeSync operate on this principle of dynamically controlling [adaptive sync] the refresh rate. There are a few differences between how the technology is implemented though. NVIDIA G-sync requires a proprietary G-sync module to be added to the monitor, which comes at quite a high cost premium.... G-sync modules also support a native blur reduction mode dubbed ULMB (Ultra Low Motion Blur). This allows the user to opt for a strobe backlight system if they want, in order to reduce perceived motion blur in gaming. It cannot be used at the same time as G-sync since ULMB operates at a fixed refresh rate only, but it's a useful extra option for these gaming screens. Of course since G-sync/ULMB are an NVIDIA technology, it only works with specific G-sync compatible NVIDIA graphics cards. While you can still use a G-sync monitor from an AMD/Intel graphics card for other uses, you can't use the actual G-sync or ULMB functions. .... There is no native blur reduction mode coupled with FreeSync support so it is down to the display manufacturer whether they add an extra blur reduction method themselves. "
 
Remember this thread? How about someone picks at this news thread, actually gets some proof, and posts a valid story about Nvidia being devious.
This IS happening; surely proof would be easy to find. The tubers got involved a while back, so it might mend some bridges.

And it puts the limelight where it belongs: on Nvidia, the anti-consumer company, not btarunner or TPU.
 