
LG OLED TVs Receive NVIDIA G-SYNC Upgrade Starting This Week

False marketing by definition is what they are doing. Calling AMD's FreeSync tech a kind of G-Sync, when G-Sync originally needed a module to work properly and adaptive sync is now simply labeled a mode of G-Sync, is deceptive marketing beyond any limit. I think the whole tech community should call them out for that practice. They milked customers with G-Sync initially, and when FreeSync proved to work well and took over the TV market they changed strategy and made their drivers work with those monitors and TVs just as well as AMD GPUs did from the start.
 
This is not FreeSync at all. As far as I know, FreeSync over HDMI is not compatible with HDMI 2.1 VRR; it is a custom implementation with HDMI extensions. So this is actually a first: a graphics card that can push HDMI VRR. The Xbox One can as well, and was the very first device of any sort to support it, but Nvidia should get credit for being first on the graphics card side, and LG is namechecking Nvidia because they helped test LG's implementation of VRR and developed the driver capability alongside it. If this is anything like HDMI 2.0 HDR, it's going to take a while before TV makers converge on an actual, working paradigm for universal support. My HDR TV had many updates to the feature, and had to be explicitly updated to fix some Xbox One teething issues where the HDR was sent in a way that differed from the spec. LG had issues on older OLED TVs with black crush and a dim picture on HDR content.

So restricting this to just LG's TVs right now also makes sense. Because no other TV maker has the feature implemented. Samsung has FreeSync support, but it is not the same as the standard VRR. As more TV makers actually get this implemented, I'm sure the support will be extended to them.

This is not Nvidia stealing FreeSync HDMI. This is Nvidia beating AMD to the punch in supporting the standard. Everyone trying to cast this as just marketing doesn't really understand the technical hurdles of actually making this happen. I don't think Nvidia will ever support FreeSync HDMI. Future monitors will include HDMI 2.1 VRR and Nvidia will be in a place to enable it more broadly once all of the disparate engineering teams finally implement it in a universal, standards-compliant way. I hope AMD catches up soon.
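
To put the technical difference in concrete terms: a sink advertises standard HDMI Forum VRR and AMD's FreeSync-over-HDMI through different vendor-specific data blocks in its EDID. Here's a rough, illustrative Python sketch of how a source could tell them apart; the parsing is simplified and the helpers are my own invention, not any vendor's actual driver code.

```python
# Illustrative sketch: scan a CTA-861 EDID extension block for vendor-specific
# data blocks. Assumption: HDMI Forum features (including HDMI 2.1 VRR) are
# advertised in the HF-VSDB (OUI C4-5D-D8), while FreeSync-over-HDMI uses an
# AMD vendor-specific block (OUI 00-00-1A). Real EDID parsing (checksums,
# multiple extension blocks, capability bit decoding) is omitted.

KNOWN_OUIS = {
    0xC45DD8: "HDMI Forum VSDB (HDMI 2.x features, incl. VRR)",
    0x000C03: "HDMI Licensing VSDB (HDMI 1.4)",
    0x00001A: "AMD VSDB (FreeSync over HDMI)",
}

def vendor_blocks(cta_ext: bytes):
    """Yield (oui, payload) for each vendor-specific data block in a CTA extension."""
    dtd_offset = cta_ext[2]            # byte 2: offset where detailed timings start
    pos = 4                            # data block collection begins at byte 4
    while pos < dtd_offset:
        tag = cta_ext[pos] >> 5        # bits 7..5: data block tag
        length = cta_ext[pos] & 0x1F   # bits 4..0: payload length in bytes
        if tag == 3 and length >= 3:   # tag 3 = vendor-specific data block
            oui = int.from_bytes(cta_ext[pos + 1:pos + 4], "little")
            yield oui, bytes(cta_ext[pos + 4:pos + 1 + length])
        pos += 1 + length

def describe_sink(cta_ext: bytes) -> None:
    for oui, _payload in vendor_blocks(cta_ext):
        print(f"OUI {oui:06X}: {KNOWN_OUIS.get(oui, 'unknown vendor block')}")
```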
 
If it has no HDMI 2.1, then skip.

It has, but there's no GPU that has one. You have to wait for next-gen gaming cards for that... But anyway, I don't really know if these things even have higher than 4K 60 Hz panels to really take advantage of HDMI 2.1 bandwidth. Nothing forces a manufacturer to use a 4K high-refresh-rate panel; HDMI 2.1 just makes it possible.

Ok, again:
- LG 2019 OLED TVs have HDMI 2.1 with VRR support. Very much a standard thing.
- What Nvidia arranged was to add support for the same feature in their current GPU lineup that only has HDMI 2.0.
- No extra cost, no effect on the standard feature.

What exactly is here to be upset about?
What vendor lock? Deceiving how?

Yeah, I see this as a test vehicle for next-gen cards' full support of HDMI 2.1. VRR is a _compulsory_ feature of the HDMI 2.1 spec, so there's little chance of it being restricted from other vendors. I'm more concerned that the TV maker is LG; I've had a bunch of those with crazy batshit firmwares and malfunctioning EDIDs.
 
It has, but there's no GPU that has one. You have to wait for next-gen gaming cards for that... But anyway, I don't really know if these things even have higher than 4K 60 Hz panels to really take advantage of HDMI 2.1 bandwidth. Nothing forces a manufacturer to use a 4K high-refresh-rate panel; HDMI 2.1 just makes it possible.

Many, if not all, high-end 4K TVs do use 120 Hz panels. Almost all of them support 1080p at 120 Hz, so the panel itself is capable of it. This is not a CRT situation where lower resolutions unlock higher framerates for physical reasons: the pixels can change at 120 Hz, so it is supported at the panel level even at 4K. The trick is the image processing for output at 4K. Many of them also have the Soap Opera Effect capabilities that can interpolate 60 Hz to 120 Hz, even from 4K sources, so the internal processing can support output at 4K 120 Hz in those cases, and that's a good sign that the connection from the chipset to the panel driver is set up to work at those frequencies. The real missing piece is the ability to accept 4K120 over HDMI, which requires seriously high bandwidth and is not something that commodity parts for the HDMI interface can support. That is probably the limiting factor at this point. Now, it might be possible to stream 4K120 video if the TV's GPU can decode at that rate, but that is not very likely for all but the most advanced chipsets, like the ones LG and Sony use. Or it might be "supported" on the spec sheet but not implemented yet. And honestly, video at 120 Hz is probably pretty useless, since no major network or production company supports it through the production or transmission pipeline. Even 60 Hz streaming is relatively new for over-the-top TV streaming services like Hulu TV.

So nothing forces TV manufacturers to use 4k high refresh rate panels, but they all do anyway. LG, Sony, Samsung, Vizio (P series at least). You can check rtings.com for the model to see if it supports 1080p120 and that will tell you what the panel supports for refresh rate. The missing part is the rest of the chain, and my guess is it's really down to the availability of commodity 2.1 support in the HDMI interface board.
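
For a back-of-the-envelope sense of that bandwidth gap, here's a rough calculation; the blanking overhead and coding-efficiency figures are approximations for illustration, not exact CTA-861 timing math.

```python
# Rough estimate of the link rate needed for 4K 120 Hz RGB and how it compares
# to HDMI 2.0 vs HDMI 2.1. The blanking overhead and coding-efficiency numbers
# are approximations, not exact timing values.

def required_gbps(h, v, hz, bits_per_pixel, blanking_overhead=1.10):
    """Approximate uncompressed video rate in Gbit/s, including blanking."""
    return h * v * hz * bits_per_pixel * blanking_overhead / 1e9

need = required_gbps(3840, 2160, 120, 30)   # 4K120, 10-bit RGB = 30 bits/pixel

hdmi20_payload = 18.0 * 8 / 10    # ~14.4 Gbit/s after 8b/10b TMDS coding
hdmi21_payload = 48.0 * 16 / 18   # ~42.7 Gbit/s after 16b/18b FRL coding

print(f"4K120 10-bit RGB needs ~{need:.1f} Gbit/s")
for name, cap in (("HDMI 2.0", hdmi20_payload), ("HDMI 2.1", hdmi21_payload)):
    verdict = "fits" if need <= cap else "does not fit"
    print(f"{name}: ~{cap:.1f} Gbit/s payload -> 4K120 {verdict}")
```

In round numbers, 4K120 at 10-bit works out to roughly 33 Gbit/s, well beyond what HDMI 2.0 can carry but comfortably inside HDMI 2.1's FRL payload.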
 
Nvidia's implementation was and still is unironically superior, not plagued by the stuttering and ghosting limitations of FreeSync. Plus, monitor manufacturers provide G-Sync for the highest-end, cutting-edge monitors because 4K60+ is still implausible on ultra settings for the R-VII/5700XT.

The onboard chip also did most of the heavy lifting, so the framerate drops weren't anywhere near as precipitous as with FreeSync. It could've been developed into something much better.

Not true if you look at displays with identical panels, like the XB273K and XV273K. Surprisingly, the XV273K with FreeSync has better brightness and overall picture quality.
FreeSync versions of the same display usually perform just like their G-Sync counterparts. It's just that the G-Sync module came with adaptive overdrive, and the FreeSync module usually doesn't support that.

And by the way, the G-Sync module is an FPGA, so even Nvidia themselves didn't plan to sell that thing for such a long time.

And one big thing about this news is that this is the first time Nvidia is supporting VRR via HDMI. No graphics card from Nvidia supports HDMI 2.1 yet, so this is FreeSync via HDMI 2.0. Currently only the GeForce beta driver for Windows Insiders has this ability.
 
Nvidia's implementation was and still is unironically superior, not plagued by the stuttering and ghosting limitations of FreeSync. Plus, monitor manufacturers provide G-Sync for the highest-end, cutting-edge monitors because 4K60+ is still implausible on ultra settings for the R-VII/5700XT.

The onboard chip also did most of the heavy lifting, so the framerate drops weren't anywhere near as precipitous as with FreeSync. It could've been developed into something much better.
May wanna change that nickname to big, fat phony.

And uninformed at the same time too. FreeSync was never any worse than G-Sync when they both worked as intended. G-Sync monitors have ULMB plus adaptive V-sync in the drivers, and that's pretty much where the advantages end. Most people are not willing to pay the premium for that, though in my experience it's pretty mind-boggling, not even in shooters but particularly in parkour or melee first-person games like Dying Light or Shadow Warrior. If one can afford it, I'd definitely recommend they try it.
There's low framerate compensation too, but trying to make sub-30 fps gaming better is like turbocharging a Zastava, so it's not really of any importance. Who in their sane mind spends $500-1000 on a monitor and cares about LFC? Really.
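(For anyone curious, LFC itself is conceptually simple: when the frame rate drops below the panel's minimum VRR rate, the driver just repeats frames so the refresh interval stays inside the supported window. A toy sketch of the idea, with made-up numbers:)

```python
# Toy illustration of Low Framerate Compensation (LFC): when the content frame
# rate falls below the panel's minimum VRR rate, each frame is scanned out
# multiple times so the effective refresh stays inside the VRR window.
# (Edge cases like very narrow VRR windows are ignored here.)

def lfc_refresh_hz(content_fps: float, vrr_min_hz: float, vrr_max_hz: float) -> float:
    """Return the refresh rate the panel would actually run at."""
    if content_fps >= vrr_min_hz:
        return min(content_fps, vrr_max_hz)    # inside the native VRR range
    multiplier = 2
    while content_fps * multiplier < vrr_min_hz:
        multiplier += 1                        # repeat each frame 2x, 3x, ...
    return content_fps * multiplier

# Example: a panel with a 48-120 Hz VRR window
for fps in (100, 48, 35, 24):
    print(f"{fps} fps content -> panel refreshes at {lfc_refresh_hz(fps, 48, 120)} Hz")
```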
Ghosting issues? They were probably caused by the fact that you could certify any piece of crap as FreeSync, while for G-Sync you could not.

Anyway, I reported your poor attempts to get me banned for an alt account to the staff, and we have a pretty decent idea who you are too. Strangely, your main account's activity is somehow related to the posts from the alt one. I guess the only argument you could win on TPU is with yourself :roll: :laugh:
 
That may be, but the 20- and 16-Series GPUs have HDMI 2.0b, which means there is something else going on here.

I wonder if there is any money involved or if it's simply a partnership meant to benefit both parties. I'd really be interested to know the answer to that, because it's possible there are some shenanigans going on here considering that AMD is more-or-less the proprietor of VRR via HDMI 2.0b and earlier versions.

With the WDDM 2.7 beta driver on a Windows Insider preview build, GeForce cards already support VRR via HDMI 2.0b. That's not a secret. And I didn't get any NDA for that.
 
Nvidia's implementation was and still is unironically superior, not plagued by the stuttering and ghosting limitations of FreeSync. Plus, monitor manufacturers provide G-Sync for the highest-end, cutting-edge monitors because 4K60+ is still implausible on ultra settings for the R-VII/5700XT.

The onboard chip also did most of the heavy lifting, so the framerate drops weren't anywhere near as precipitous as with FreeSync. It could've been developed into something much better.

1. There is no "heavy lifting" when it comes to VRR. No idea where you got that bit from but it's false.
2. Stuttering? FreeSync is not plagued by stuttering.

You make a few unfounded claims; there isn't a single tech outlet that backs them up.
 
It's just that the G-Sync module came with adaptive overdrive, and the FreeSync module usually doesn't support that.

And by the way, the G-Sync module is an FPGA, so even Nvidia themselves didn't plan to sell that thing for such a long time.

What exactly is a "Freesync module"???

FPGA makes more $ense than ASIC for low volumes. Nvidia understands the economics involved.

The title is kind of misleading; it should be "LG OLED TVs receive 'G-SYNC Compatible Certification'."
 
G-SYNC Compatible Now Available On LG 2019 4K OLED TVs, For Smoother, Tear-Free PC Gaming
If you don’t own an LG TV, but do own a display or TV that only supports Variable Refresh Rates via HDMI, you can try enabling HDMI VRR by installing the new Game Ready Driver, and enabling G-SYNC as detailed above. As these displays and TVs have yet to go through our comprehensive validation process, we can’t guarantee VRR will work, or work without issue.
Be assured though, we will be testing current and future Variable Refresh Rate HDMI TV displays, and will be working with manufacturers like LG to bring flawless out-of-the-box G-SYNC Compatible support to their big-screen TVs, so more GeForce gamers can enjoy smooth gameplay from the comfort of their couch.

Mm, I find that very interesting and a nice addition. It's also great to hear that lower resolutions can utilise a higher refresh rate. As for all the FreeSync/G-Sync/G-Sync Compatible "debate" going on, I think I'll just continue referring to the objective analysis from BlurBusters & TFTCentral...
 
What exactly is a "Freesync module"???

FPGA makes more $ense than ASIC for low volumes. Nvidia understands the economics involved.

The title is kind of misleading; it should be "LG OLED TVs receive 'G-SYNC Compatible Certification'."

The "FreeSync module" is usually part of the display controller's DSP. It's not a physical chip or anything; I just need a term to describe where this "adaptive overdrive" feature lives. There's one FreeSync monitor that actually supports this feature.

This is currently the only feature missing from almost every FreeSync display, and that's it. G-Sync doesn't give them any more than this.
The G-Sync module is a total joke considering Nvidia is an ASIC design company.
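
To make "adaptive overdrive" concrete: with VRR the frame time changes every frame, so the overdrive strength has to be chosen per frame instead of being tuned for one fixed refresh rate. Here's a rough conceptual sketch; the table values and function names are invented for illustration, not taken from any real panel firmware.

```python
# Conceptual sketch of adaptive (variable) overdrive under VRR: the overdrive
# gain is interpolated from a per-refresh-rate tuning table instead of using a
# single value tuned only for the panel's maximum refresh rate.
# The table values below are invented for illustration.

import bisect

OVERDRIVE_TABLE = [(48, 0.4), (60, 0.5), (90, 0.7), (120, 0.9)]  # (Hz, gain)

def overdrive_gain(refresh_hz: float) -> float:
    """Linearly interpolate the overdrive gain for the current refresh rate."""
    rates = [hz for hz, _ in OVERDRIVE_TABLE]
    if refresh_hz <= rates[0]:
        return OVERDRIVE_TABLE[0][1]
    if refresh_hz >= rates[-1]:
        return OVERDRIVE_TABLE[-1][1]
    i = bisect.bisect_right(rates, refresh_hz)
    (hz0, g0), (hz1, g1) = OVERDRIVE_TABLE[i - 1], OVERDRIVE_TABLE[i]
    t = (refresh_hz - hz0) / (hz1 - hz0)
    return g0 + t * (g1 - g0)

def driven_level(target: int, previous: int, refresh_hz: float) -> int:
    """Overshoot the requested pixel transition in proportion to the gain."""
    gain = overdrive_gain(refresh_hz)
    return max(0, min(255, round(target + gain * (target - previous))))

print(driven_level(180, 120, 120))  # fast refresh -> stronger overshoot (234)
print(driven_level(180, 120, 48))   # slow refresh -> gentler overdrive (204)
```

Without something like this, a fixed overdrive tuned for the maximum refresh overshoots badly at low refresh rates, which is where a lot of the FreeSync ghosting complaints come from.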
 
False marketing by definition is what they are doing.

Yeah, because everyone cares who developed what standard...

No, in actuality no consumer outside of here gives a shit about that. They just want to know what GPU this solution is promised to work with.

This is promised to work with Nvidia, so they used Nvidia branding. The fact that most vendors choose Nvidia brand indicators over AMD ones is not false advertising either; they probably test it on the market leader (which, like it or not, is Nvidia). They may work with AMD too, but few even consider that fact worth marketing anymore. That's a problem with AMD's brand position and with generally having been poor competition for 2+ years, not an Nvidia scam routine.

Playing the AMD-victim card regularly is going to become part of their brand too, if I keep reading crap like this.
 
Yeah, because everyone cares who developed what standard...

This isn't about that. There used to be G-Sync, and everything else was branded as FreeSync; now everything is slowly being labeled G-Sync Compatible. The problem isn't that it's false advertising, but rather that AMD is being removed for no obvious reason from marketing material, making people believe that their hardware is incompatible.

It's irrelevant whose brand is in a better position; when a company uses their brand, they do it on the basis of a partnership that entails certain restrictions. And thus we get to the real reason why this is a problem: Nvidia is clearly doing something which makes manufacturers want to avoid mentioning AMD.

You know, a year ago Nvidia was also the market leader and AMD was in an arguably worse position, and yet manufacturers were happily slapping AMD/FreeSync on everything. Surely there is more to it than just brand recognition.
 
rather that AMD is being removed for no obvious reason from marketing material, making people believe that their hardware is incompatible.

The "no obvious reason" is simply that there is no reason to advertise AMD compatability for the limited marketshare they represent. This occured the moment nvidia opened the gsync brand to non gysnc module monitors.

If you want to blame anyone, blame the vendors, and capitalism maybe.
 
The "no obvious reason" is simply that there is no reason to advertise AMD compatability for the limited marketshare they represent.

Really ? Let's see.

AMD has limited market share, yet their brand manages to make its way onto a lot of manufacturers' products. (This is pre G-Sync Compatible.)

Nvidia enables compatibility on non-G-Sync displays, and AMD's brand starts to get removed from existing and new products.

Potential customers with AMD hardware didn't vanish and might even have grown in number, so why would a vendor stop advertising something they previously advertised with no problem? This cannot be explained with market share; in fact it makes absolutely no sense, since both AMD and Nvidia can help extend their reach. There is no reason to exclude one or the other.

A vendor would always want to list as many compatible products as possible, not just pick one because it has the highest market share. This is, of course, if nothing constrains them.
 
Potential customers with AMD hardware didn't vanish and might even have grown in number, so why would a vendor stop advertising something they previously advertised with no problem? This cannot be explained with market share; in fact it makes absolutely no sense, since both AMD and Nvidia can help extend their reach. There is no reason to exclude one or the other.

You can extend that to several brands maybe, but not to the article or the monitor/TV we're talking about, as this has never had FreeSync compatibility in any form, marketed or otherwise.

That was my point.

Even then, if one has to choose between advertising FreeSync or G-Sync, the market choice is obvious. Of course, until I see evidence of such an agreement existing, I won't even begin to comment on its theoretical anticompetitive nature.
 
Yeah, because everyone cares who developed what standard...

No, in actuality no consumer outside of here gives a shit about that. They just want to know what GPU this solution is promised to work with.

This is promised to work with Nvidia, so they used Nvidia branding. The fact that most vendors choose Nvidia brand indicators over AMD ones is not false advertising either; they probably test it on the market leader (which, like it or not, is Nvidia). They may work with AMD too, but few even consider that fact worth marketing anymore. That's a problem with AMD's brand position and with generally having been poor competition for 2+ years, not an Nvidia scam routine.

Playing the AMD-victim card regularly is going to become part of their brand too, if I keep reading crap like this.
Renaming and taking advantage of a free technology that benefits gamers and was made by another company, in order to sell more of your own products, especially after you milked that specific market for years with the G-Sync modules sold inside monitors, is clearly false marketing and unethical by any measure. Everyone is free to believe and think whatever they like, but logic and facts stand above all thoughts and ideas. End of discussion from my side.
 
Nvidia is pretty much targeting a niche with its marketing, since:
- HDMI 2.1 is nonexistent on PC
- PC gamers mostly use desktop displays; not many of them have a 55" OLED plugged into their PC
- the console market atm is dominated by AMD as the hardware supplier
So if future OLEDs from LG include VRR support (which AFAIK is required for full HDMI 2.1 support), the ones who will benefit the most are owners of next-gen consoles based on AMD hardware.
 
You can extend that to several brands maybe, but not to the article or the monitor/TV we're talking about, as this has never had FreeSync compatibility in any form, marketed or otherwise.

That was my point.

Even then, if one has to choose between advertising FreeSync or G-Sync, the market choice is obvious. Of course, until I see evidence of such an agreement existing, I won't even begin to comment on its theoretical anticompetitive nature.

And to add to that: the G-Sync Compatible badge is not something a monitor/TV manufacturer can just slap on a monitor anyway; Nvidia does a validation process for those too (albeit not quite as strict as the one it does for G-Sync and G-Sync Ultimate monitors).

Nvidia is pretty much targeting a niche with its marketing, since:
- HDMI 2.1 is nonexistent on PC
- PC gamers mostly use desktop displays; not many of them have a 55" OLED plugged into their PC
- the console market atm is dominated by AMD as the hardware supplier
So if future OLEDs from LG include VRR support (which AFAIK is required for full HDMI 2.1 support), the ones who will benefit the most are owners of next-gen consoles based on AMD hardware.

Again, it's a forward-looking driver implementation of HDMI 2.1 VRR. All Turing cards with HDMI video output now support the compulsory HDMI 2.1 VRR feature, albeit they are not HDMI 2.1 cards. So if your TV has an HDMI 2.1 input, you can force G-Sync compatibility from the Nvidia control panel.
 
And to add to that: the G-Sync Compatible badge is not something a monitor/TV manufacturer can just slap on a monitor anyway; Nvidia does a validation process for those too (albeit not quite as strict as the one it does for G-Sync and G-Sync Ultimate monitors).

Indeed, if anything G-Sync Compatible helps separate the good FreeSync displays from the trash.
 
Nvidia is pretty much targeting a niche with its marketing, since:
- HDMI 2.1 is nonexistent on PC
- PC gamers mostly use desktop displays; not many of them have a 55" OLED plugged into their PC
- the console market atm is dominated by AMD as the hardware supplier
So if future OLEDs from LG include VRR support (which AFAIK is required for full HDMI 2.1 support), the ones who will benefit the most are owners of next-gen consoles based on AMD hardware.
You are joking? Everyone I know uses 43"+ TVs plugged into their PC. I use a 55" NU8000 / 65" NU8000, and now I will upgrade to the LG OLED or a short-throw laser projector once the price drops a little and HDMI 2.1 comes out fully. I'm so excited for it; can't wait for the 3080 Ti or whatever will be next. I've been using a 1080 Ti for so long it makes me crazy.

If you waste your money on a 27" or smaller gaming monitor, shame on you.

Funny, most people say "oh, the 55" doesn't fit my allotted space." Well then the first thing you need to do is focus on getting a bigger room! Then a bigger TV, then have a better experience ;P I went from a 21" Trinitron to an LG 37" to a 43" 4K to a 55" 1440@120Hz Samsung TV / 65" depending on the room. Even my mom uses a 4K 43"+ TV for her desktop. Welcome to 5+ years ago :)
 
So, LG OLED gets VRR? This better not be locked to one camp...
 
I wonder if AMD cares about you; you already bought their cards and are in love with them, so why should they try to fix their drivers? :p :p :p :love:

Irrelevant to the discussion and basically just baiting. Don't do that please.
 