
LG OLED TVs Receive NVIDIA G-SYNC Upgrade Starting This Week

Joined
Apr 30, 2011
Messages
2,652 (0.56/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
False marketing by definition is what they are doing. Calling AMD's FreeSync tech a kind of G-Sync, when G-Sync needed a module to work properly, and now branding Adaptive-Sync as a mode of G-Sync, is deceptive marketing beyond any limit. I think the whole tech community should call them out for that practice. They milked customers with G-Sync initially, and once FreeSync proved to work well and took over the TV market, they changed strategy and made their drivers work with those monitors and TVs the way AMD GPUs had from the start.
 
Joined
Jun 16, 2016
Messages
409 (0.14/day)
System Name Baxter
Processor Intel i7-5775C @ 4.2 GHz 1.35 V
Motherboard ASRock Z97-E ITX/AC
Cooling Scythe Big Shuriken 3 with Noctua NF-A12 fan
Memory 16 GB 2400 MHz CL11 HyperX Savage DDR3
Video Card(s) EVGA RTX 2070 Super Black @ 1950 MHz
Storage 1 TB Sabrent Rocket 2242 NVMe SSD (boot), 500 GB Samsung 850 EVO, and 4TB Toshiba X300 7200 RPM HDD
Display(s) Vizio P65-F1 4KTV (4k60 with HDR or 1080p120)
Case Raijintek Ophion
Audio Device(s) HDMI PCM 5.1, Vizio 5.1 surround sound
Power Supply Corsair SF600 Platinum 600 W SFX PSU
Mouse Logitech MX Master 2S
Keyboard Logitech G613 and Microsoft Media Keyboard
This is not FreeSync at all. As far as I know, FreeSync over HDMI is not compatible with HDMI 2.1 VRR; it is a custom implementation built on HDMI extensions. So this is actually a first: a graphics card that can push HDMI VRR. The Xbox One can as well, and it was the very first device of any sort to support it, but Nvidia should get credit for being first among graphics cards. LG is namechecking Nvidia because Nvidia helped test LG's implementation of VRR and developed the driver capability alongside it. If this is anything like HDMI 2.0 HDR, it's going to take a while before TV makers converge on an actual, working paradigm for universal support. My HDR TV had many updates to the feature, and had to be explicitly updated to fix some Xbox One teething issues where HDR was sent in a way that differed from the spec. LG had issues on older OLED TVs with black crush and a dim picture on HDR content.

So restricting this to just LG's TVs right now also makes sense, because no other TV maker has the feature implemented. Samsung has FreeSync support, but it is not the same as standard VRR. As more TV makers actually implement this, I'm sure support will be extended to them.

This is not Nvidia stealing FreeSync HDMI. This is Nvidia beating AMD to the punch in supporting the standard. Everyone trying to cast this as just marketing doesn't really understand the technical hurdles of actually making this happen. I don't think Nvidia will ever support FreeSync HDMI. Future monitors will include HDMI 2.1 VRR and Nvidia will be in a place to enable it more broadly once all of the disparate engineering teams finally implement it in a universal, standards-compliant way. I hope AMD catches up soon.
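For anyone curious how software can actually tell these two schemes apart: both are advertised in the display's EDID, FreeSync-over-HDMI through an AMD vendor-specific data block and standard HDMI VRR through the HDMI Forum's. Here is a minimal Python sketch under that assumption; the OUI constants are the publicly documented ones and the layout follows CTA-861, but treat the exact offsets as illustrative rather than authoritative.

```python
# Sketch: classify which VRR flavour(s) a display's EDID advertises by
# scanning the CTA-861 extension block for vendor-specific data blocks.
HDMI_FORUM_OUI = 0xC45DD8  # HDMI Forum VSDB; carries HDMI 2.1 VRR caps
AMD_OUI        = 0x00001A  # AMD VSDB; FreeSync-over-HDMI signalling

def cta_data_blocks(edid: bytes):
    """Yield (tag, payload) for each data block in the first extension."""
    ext = edid[128:256]                 # first EDID extension block
    if len(ext) < 4 or ext[0] != 0x02:  # 0x02 = CTA-861 extension tag
        return
    dtd_start = ext[2]                  # detailed timings begin here
    i = 4
    while i < dtd_start:
        tag, length = ext[i] >> 5, ext[i] & 0x1F
        yield tag, ext[i + 1 : i + 1 + length]
        i += 1 + length

def vrr_flavours(edid: bytes) -> set:
    found = set()
    for tag, payload in cta_data_blocks(edid):
        if tag == 3 and len(payload) >= 3:  # tag 3 = vendor-specific
            oui = payload[0] | payload[1] << 8 | payload[2] << 16
            if oui == HDMI_FORUM_OUI:
                found.add("HDMI Forum VSDB (standard HDMI VRR)")
            elif oui == AMD_OUI:
                found.add("AMD VSDB (FreeSync over HDMI)")
    return found
```

Feeding it a raw dump such as /sys/class/drm/card0-HDMI-A-1/edid shows at a glance whether a TV speaks the AMD dialect, the standard one, or both.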
 
Joined
Mar 10, 2014
Messages
1,793 (0.48/day)
If it has no HDMI 2.1, then skip.

It has, but there's no GPU that has one. You have to wait for next-gen gaming cards for that... But anyway, I don't really know if these things even have higher-than-4K60 panels to really take advantage of HDMI 2.1 bandwidth. Nothing forces a manufacturer to use a 4K high-refresh-rate panel; HDMI 2.1 just makes it possible.

Ok, again:
- LG 2019 OLED TVs have HDMI 2.1 with VRR support. Very much a standard thing.
- What Nvidia arranged was to add support for the same feature in their current GPU lineup that only has HDMI 2.0.
- No extra cost, no effect on the standard feature.

What exactly is here to be upset about?
What vendor lock? Deceiving how?

Yeah, I see this as a test vehicle for next-gen cards' full support of HDMI 2.1. VRR is a _compulsory_ feature of the HDMI 2.1 spec, so there's little chance of it being restricted to certain vendors. I'm more concerned that the TV maker's name is LG; I've had a bunch of their sets with crazy batshit firmware and malfunctioning EDIDs.
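As an aside, the mechanism behind HDMI VRR is simple: the pixel clock stays fixed and the source stretches the vertical blanking until the next frame is ready. A rough sketch of the arithmetic, assuming the CTA-861 4k120-class timing (1188 MHz clock, 4400 total pixels per line, 2160 active lines):

```python
# Sketch: VRR varies refresh by extending vertical blanking, so the
# frame period matches the render time while the pixel clock is fixed.
PIXEL_CLOCK = 1_188_000_000  # Hz, CTA-861 4k120-class timing (assumed)
H_TOTAL     = 4400           # pixels per line including blanking
V_ACTIVE    = 2160           # visible lines per frame

def v_total_for(frame_time_s: float) -> int:
    """Total lines per frame needed to hit a given frame period."""
    return round(frame_time_s * PIXEL_CLOCK / H_TOTAL)

for fps in (120, 90, 60, 48):
    vt = v_total_for(1 / fps)
    print(f"{fps:3d} fps -> v_total {vt} lines "
          f"({vt - V_ACTIVE} lines of blanking)")
```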
 
Joined
Jun 16, 2016
Messages
409 (0.14/day)
System Name Baxter
Processor Intel i7-5775C @ 4.2 GHz 1.35 V
Motherboard ASRock Z97-E ITX/AC
Cooling Scythe Big Shuriken 3 with Noctua NF-A12 fan
Memory 16 GB 2400 MHz CL11 HyperX Savage DDR3
Video Card(s) EVGA RTX 2070 Super Black @ 1950 MHz
Storage 1 TB Sabrent Rocket 2242 NVMe SSD (boot), 500 GB Samsung 850 EVO, and 4TB Toshiba X300 7200 RPM HDD
Display(s) Vizio P65-F1 4KTV (4k60 with HDR or 1080p120)
Case Raijintek Ophion
Audio Device(s) HDMI PCM 5.1, Vizio 5.1 surround sound
Power Supply Corsair SF600 Platinum 600 W SFX PSU
Mouse Logitech MX Master 2S
Keyboard Logitech G613 and Microsoft Media Keyboard
It has, but there's no GPU that has one. You have to wait for next-gen gaming cards for that... But anyway, I don't really know if these things even have higher-than-4K60 panels to really take advantage of HDMI 2.1 bandwidth. Nothing forces a manufacturer to use a 4K high-refresh-rate panel; HDMI 2.1 just makes it possible.

Many, if not all, high-end 4k TVs do use 120 Hz panels. They almost all support 1080p 120 Hz, so the panel itself is capable of it. This is not a CRT situation where lower resolutions unlock higher framerates from a physical perspective: the pixels can change at 120 Hz, so it is hardware-supported at the panel level at 4k. The trick is the image processing for output at 4k. Many of them also have the Soap Opera Effect capabilities that can interpolate 60 Hz to 120 Hz, even from 4k sources. So the internal processing can support output at 4k 120 Hz in these cases, and that's a good sign that the connection from the chipset to the LCD driver is set up to work at those high frequencies. The real missing piece is the ability to accept 4k120 over HDMI, which requires seriously high bandwidth and is not something that commodity parts for the HDMI interface can support. This is probably the limiting factor at this point. Now, it would maybe be possible to stream 4k120 video if the TV's GPU can decode at that rate. That is also not very likely for all but the most advanced chipsets, like the ones LG and Sony use. Or it might be "supported" on the spec sheet but not implemented yet. And honestly, video at 120 Hz is probably pretty useless, since no major network or production company supports it through the production or transmission pipeline. Even 60 Hz streaming is relatively new for over-the-top TV streaming services like Hulu TV.
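To put rough numbers on that bandwidth point: 4k60 8-bit RGB only just fits HDMI 2.0's 18 Gbps TMDS link, while 4k120 blows far past it and needs HDMI 2.1's 48 Gbps FRL. A back-of-envelope sketch using the CTA-861 total timings; the link ceilings and line-code overheads are from the public specs:

```python
# Sketch: compare a mode's raw bit rate against each HDMI link's usable
# payload rate (raw ceiling scaled by the line-code efficiency).
def mode_rate(h_total: int, v_total: int, fps: int, bpc: int = 8) -> float:
    """Raw video bit rate in bits/s for an RGB mode, blanking included."""
    return h_total * v_total * fps * bpc * 3

links = {
    "HDMI 2.0 (18 Gbps TMDS, 8b/10b)": 18e9 * 8 / 10,
    "HDMI 2.1 (48 Gbps FRL, 16b/18b)": 48e9 * 16 / 18,
}
modes = {
    "4k60 ": mode_rate(4400, 2250, 60),   # -> about 14.3 Gbps
    "4k120": mode_rate(4400, 2250, 120),  # -> about 28.5 Gbps
}
for mode, need in modes.items():
    for link, have in links.items():
        verdict = "fits" if need <= have else "does NOT fit"
        print(f"{mode} needs {need/1e9:4.1f} Gbps vs {link}: {verdict}")
```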

So nothing forces TV manufacturers to use 4k high refresh rate panels, but they all do anyway. LG, Sony, Samsung, Vizio (P series at least). You can check rtings.com for the model to see if it supports 1080p120 and that will tell you what the panel supports for refresh rate. The missing part is the rest of the chain, and my guess is it's really down to the availability of commodity 2.1 support in the HDMI interface board.
 

MikeZTM

New Member
Joined
Oct 25, 2019
Messages
16 (0.01/day)
Nvidia's implementation was and still is unironically superior, not plagued by the stuttering and the ghosting limitations of FreeSync. Plus, monitor manufacturers provide G-Sync for the highest-end and cutting-edge monitors because 4K60+ is still implausible on ultra settings for the R-VII/5700XT.

The onboard chip also did most of the heavy lifting, so the framerate drops weren't anywhere near as precipitous as with FreeSync. It could've been developed into something much better.

Not true if you look at displays with identical panels, like the XB273K and XV273K. Surprisingly, the XV273K with FreeSync has better brightness and overall picture quality.
The FreeSync version of the same display usually has the same performance as its G-Sync counterpart. It's just that the G-Sync module came with adaptive overdrive, and usually the FreeSync module doesn't support that.
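For context, "adaptive overdrive" just means the scaler re-tunes pixel overdrive for the instantaneous refresh rate instead of one fixed rate, since a drive level tuned for 144 Hz overshoots badly at 60 Hz. A toy sketch of the idea; the tuning points and gains below are invented, not taken from any real scaler:

```python
# Sketch: interpolate overdrive strength between per-panel tuning points
# so the drive level tracks the current (variable) refresh rate.
TUNED = {60: 0.35, 144: 0.80}  # hypothetical refresh (Hz) -> gain

def adaptive_gain(refresh_hz: float) -> float:
    """Linearly interpolate the overdrive gain for the current refresh."""
    lo, hi = min(TUNED), max(TUNED)
    refresh_hz = max(lo, min(hi, refresh_hz))
    t = (refresh_hz - lo) / (hi - lo)
    return TUNED[lo] + t * (TUNED[hi] - TUNED[lo])

def overdriven_level(prev: int, target: int, refresh_hz: float) -> int:
    """Drive past the target in proportion to the step size, clamped."""
    g = adaptive_gain(refresh_hz)
    return max(0, min(255, round(target + g * (target - prev))))
```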

And by the way, the G-Sync module is an FPGA, so even Nvidia themselves didn't plan to sell that thing for such a long time.

And one big thing about this news is that it's the first time Nvidia has supported VRR via HDMI. No graphics card from Nvidia supports HDMI 2.1 yet; this is FreeSync via HDMI 2.0. Currently only the GeForce beta driver for Windows Insiders has this ability.
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Joined
Aug 6, 2017
Messages
7,412 (3.02/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
Nvidia's implementation was and still is unironically superior, not plagued by the stuttering and the ghosting limitations of FreeSync. Plus, monitor manufacturers provide G-Sync for the highest-end and cutting-edge monitors because 4K60+ is still implausible on ultra settings for the R-VII/5700XT.

The onboard chip also did most of the heavy lifting, so the framerate drops weren't anywhere near as precipitous as with FreeSync. It could've been developed into something much better.
May wanna change that nickname to big, fat phony.

And uninformed at the same time, too. FreeSync was never any worse than G-Sync when they both worked as intended. G-Sync monitors have ULMB plus adaptive V-sync in the drivers, and that's pretty much where the advantages end. Most people are not willing to pay the premium for that, though in my experience it's pretty mind-boggling, not just in shooters but particularly in parkour or melee first-person games like Dying Light or Shadow Warrior. If one can afford it, I'd definitely recommend giving it a try.
There's low framerate compensation too, but trying to make sub-30 fps gaming better is like turbocharging a Zastava, so it's not really of any importance. Who in their sane mind spends $500-1000 on a monitor and cares about LFC? Really.
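For what it's worth, LFC itself is mechanically trivial: when the frame rate drops below the panel's VRR floor, the driver scans each frame out more than once so the refresh stays in range. A toy sketch, assuming a typical 48-144 Hz certified range:

```python
# Sketch: Low Framerate Compensation repeats frames to keep the panel's
# refresh rate inside its supported VRR window.
VRR_MIN, VRR_MAX = 48, 144  # Hz, a typical certified range (assumed)

def lfc(fps: float) -> tuple:
    """Return (frame repeats, resulting panel refresh) for a given fps."""
    repeats = 1
    while fps * repeats < VRR_MIN:
        repeats += 1
    return repeats, fps * repeats

for fps in (120, 50, 30, 20):
    n, hz = lfc(fps)
    print(f"{fps:3d} fps -> each frame scanned {n}x, panel at {hz:.0f} Hz")
```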
Ghosting issues? They were probably caused by the fact that you could certify any piece of crap as FreeSync, while for G-Sync you could not.

Anyway, I reported your poor attempts to get me banned for an alt account to the staff, and we have a pretty decent idea who you are, too. Strangely, your main account's activity is somehow correlated with the alt's posts. I guess the only argument you could win on TPU is with yourself :roll: :laugh:
 

MikeZTM

New Member
Joined
Oct 25, 2019
Messages
16 (0.01/day)
That may be, but the 20- and 16-Series GPUs have HDMI 2.0b, which means there is something else going on here.

I wonder if there is any money involved or if it's simply a partnership meant to benefit both parties. I'd really be interested to know the answer to that, because it's possible there are some shenanigans going on here considering that AMD is more-or-less the proprietor of VRR via HDMI 2.0b and earlier versions.

With the WDDM 2.7 beta driver on a Windows Insider preview build, GeForce cards already support VRR via HDMI 2.0b. That's not a secret. And I didn't get any NDA for that.
 
Joined
Jul 13, 2016
Messages
2,843 (1.00/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Nvidia's implementation was and still is unironically superior, not plagued by the stuttering and the ghosting limitations of FreeSync. Plus, monitor manufacturers provide G-Sync for the highest-end and cutting-edge monitors because 4K60+ is still implausible on ultra settings for the R-VII/5700XT.

The onboard chip also did most of the heavy lifting, so the framerate drops weren't anywhere near as precipitous as with FreeSync. It could've been developed into something much better.

1. There is no "heavy lifting" when it comes to VRR. No idea where you got that bit from, but it's false.
2. Stuttering? FreeSync is not plagued by stuttering.

You're making unfounded claims; there isn't a single tech outlet that backs them up.
 
Joined
Oct 26, 2018
Messages
203 (0.10/day)
Processor Intel i5-13600KF
Motherboard ASRock Z790 PG Lightning
Cooling NZXT Kraken 240
Memory Corsair Vengeance DDR5 6400
Video Card(s) XFX RX 7800 XT
Storage Samsung 990 Pro 2 TB + Samsung 860 EVO 1TB
Display(s) Dell S2721DGF 165Hz
Case Fractal Meshify C
Power Supply Seasonic Focus 750
Mouse Logitech G502 HERO
Keyboard Logitech G512
It's just that the G-Sync module came with adaptive overdrive, and usually the FreeSync module doesn't support that.

And by the way, the G-Sync module is an FPGA, so even Nvidia themselves didn't plan to sell that thing for such a long time.

What exactly is a "Freesync module"???

An FPGA makes more $ense than an ASIC for low volumes. Nvidia understands the economics involved.

The title is kind of misleading; it should be "LG OLED TVs receive 'G-SYNC Compatible Certification'".
 
Joined
Mar 7, 2013
Messages
31 (0.01/day)
Location
Australia
Processor AMD Ryzen 9 5900X
Motherboard ASUS Crosshair VIII Hero (WiFi)
Cooling EK-Nucleus AIO CR240
Memory G.Skill Trident Z Neo (F4-3600C16D-32GTZN) 3600MHz
Video Card(s) ASUS Strix 4090
Display(s) ASUS VG27AQL @ 144Hz, Acer XB271HU @ 144Hz
Case Corsair 4000D
Power Supply ROG Thor 1200W Platinum II
Mouse Logitech G703 Lightspeed w/ PowerPlay
Keyboard Logitech G915 TKL
Software Windows 11 Professional - X64
G-SYNC Compatible Now Available On LG 2019 4K OLED TVs, For Smoother, Tear-Free PC Gaming
If you don't own an LG TV, but do own a display or TV that only supports Variable Refresh Rates via HDMI, you can try enabling HDMI VRR by installing the new Game Ready Driver and enabling G-SYNC as detailed above. As these displays and TVs have yet to go through our comprehensive validation process, we can't guarantee VRR will work, or work without issue.
Be assured though, we will be testing current and future Variable Refresh Rate HDMI TV displays, and will be working with manufacturers like LG to bring flawless out-of-the-box G-SYNC Compatible support to their big-screen TVs, so more GeForce gamers can enjoy smooth gameplay from the comfort of their couch.

Mm, I find that very interesting and a nice addition. It's also great to hear that lower resolutions can utilise a higher refresh rate. As for all the FreeSync/G-Sync/G-Sync Compatible "debate" going on, I think I'll just continue referring to the objective analysis from BlurBusters & TFTCentral...
 

MikeZTM

New Member
Joined
Oct 25, 2019
Messages
16 (0.01/day)
What exactly is a "Freesync module"???

An FPGA makes more $ense than an ASIC for low volumes. Nvidia understands the economics involved.

The title is kind of misleading; it should be "LG OLED TVs receive 'G-SYNC Compatible Certification'".

The "FreeSync module" is usually part of the display controller's DSP. It's not a physical chip or anything; I just need a term to describe where this "adaptive overdrive" feature goes. There's one FreeSync monitor that actually supports this feature.

This is currently the only feature missing on almost every FreeSync display, and that's it; G-SYNC didn't give them any more than this.
The G-SYNC module is a total joke considering Nvidia is an ASIC design company.
 
Joined
Aug 20, 2007
Messages
20,789 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
False marketing by definition is what they are doing.

Yeah, because everyone cares who developed what standard...

No, in actuality no consumer outside of here gives a shit about that. They just want to know what GPU this solution is promised to work with.

This is promised to work with Nvidia, so they used Nvidia branding. The fact that most vendors choose Nvidia brand indicators over AMD ones is not false advertising either; they probably test on the market leader (which, like it or not, is Nvidia). These displays may work with AMD too, but few even consider that fact worth marketing anymore. That's a problem with AMD's brand position and their generally having been poor competition for 2+ years, not an Nvidia scam routine.

Playing the AMD-victim card regularly is going to become part of their brand too, if I keep reading crap like this.
 
Joined
Jan 8, 2017
Messages
8,944 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Yeah, because everyone cares who developed what standard...

This isn't about that. There used to be G-Sync, and everything else was branded as FreeSync; now everything is slowly being labeled as G-Sync Compatible. The problem isn't that it's false advertising but rather that AMD is getting removed from marketing material for no obvious reason, making people believe that their hardware is incompatible.

It's irrelevant whose brand is in a better position; when a company uses another's brand, they do it on the basis of a partnership that entails certain restrictions. And thus we get to the real reason why this is a problem: Nvidia is clearly doing something which makes manufacturers want to avoid mentioning AMD.

You know, a year ago Nvidia was also the market leader and AMD was in an arguably worse position, and yet manufacturers were happily slapping AMD/FreeSync on everything; surely there is more to it than just brand recognition.
 
Joined
Aug 20, 2007
Messages
20,789 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
rather that AMD is getting removed from marketing material for no obvious reason, making people believe that their hardware is incompatible.

The "no obvious reason" is simply that there is no reason to advertise AMD compatability for the limited marketshare they represent. This occured the moment nvidia opened the gsync brand to non gysnc module monitors.

If you want to blame anyone, blame the vendors, and capitalism maybe.
 
Joined
Jan 8, 2017
Messages
8,944 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
The "no obvious reason" is simply that there is no reason to advertise AMD compatability for the limited marketshare they represent.

Really? Let's see.

AMD has limited market share, yet their brand managed to make its way onto a lot of manufacturers' products. (This is pre-G-Sync Compatible.)

Nvidia enables compatibility on non-G-Sync displays, and AMD's brand starts to get removed from existing and new products.

Potential customers with AMD hardware didn't vanish and might have even grown in number, so why would a vendor stop advertising something they previously advertised with no problem? This cannot be explained by market share; in fact, it makes absolutely no sense, since both AMD and Nvidia can help extend a vendor's reach. There is no reason to exclude one or the other.

A vendor would always want to list as many compatible products as possible, not just pick one because it has the highest market share. This is, of course, if nothing constrains them.
 
Joined
Aug 20, 2007
Messages
20,789 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
Potential customers with AMD hardware didn't vanish and might have even grown in number, so why would a vendor stop advertising something they previously advertised with no problem? This cannot be explained by market share; in fact, it makes absolutely no sense, since both AMD and Nvidia can help extend a vendor's reach. There is no reason to exclude one or the other.

You can extend that to several brands maybe, but not to the article or the monitor/TV we're talking about, as this has never had FreeSync compatibility in any form, marketed or otherwise.

That was my point.

Even then, if one has to choose between advertising FreeSync or G-Sync, the market choice is obvious. Of course, until I see evidence of such an agreement existing, I won't even begin to comment on its theoretically anticompetitive nature.
 
Joined
Apr 30, 2011
Messages
2,652 (0.56/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
Yeah, because everyone cares who developed what standard...

No, in actuality no consumer outside of here gives a shit about that. They just want to know what GPU this solution is promised to work with.

This is promised to work with Nvidia, so they used Nvidia branding. The fact that most vendors choose Nvidia brand indicators over AMD ones is not false advertising either; they probably test on the market leader (which, like it or not, is Nvidia). These displays may work with AMD too, but few even consider that fact worth marketing anymore. That's a problem with AMD's brand position and their generally having been poor competition for 2+ years, not an Nvidia scam routine.

Playing the AMD-victim card regularly is going to become part of their brand too, if I keep reading crap like this.
Renaming a free tech that benefits gamers, one made by another company, and exploiting it to sell more of your own products, especially after milking that specific market for years with the G-Sync modules sold inside monitors, is clearly false marketing and unethical by any measure. Everyone is free to believe and think what they like, but logic and facts stand above all thoughts and ideas. End of discussion from my side.
 
Joined
Feb 23, 2019
Messages
5,643 (2.99/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3800 CL16
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
Nvidia is pretty much targeting a niche with its marketing, since:
- HDMI 2.1 is nonexistent on PC
- PC gamers mostly use desktop displays; not many of them have a 55" OLED plugged into their PC
- the console market atm is dominated by AMD as the hardware supplier
So if future OLEDs from LG include VRR support (which AFAIK is required for full HDMI 2.1 support), the ones that will benefit the most are owners of next-gen consoles based on AMD hardware.
 
Joined
Mar 10, 2014
Messages
1,793 (0.48/day)
You can extend that to several brands maybe, but not to the article or the monitor/TV we're talking about, as this has never had FreeSync compatibility in any form, marketed or otherwise.

That was my point.

Even then, if one has to choose between advertising FreeSync or G-Sync, the market choice is obvious. Of course, until I see evidence of such an agreement existing, I won't even begin to comment on its theoretically anticompetitive nature.

And to add to that: the G-Sync Compatible badge is not something a monitor/TV manufacturer can just slap on anyway; Nvidia does run a validation process for those too (albeit not quite as strict as the one for G-Sync and G-Sync Ultimate monitors).

Nvidia is pretty much targeting a niche with its marketing, since:
- HDMI 2.1 is nonexistent on PC
- PC gamers mostly use desktop displays; not many of them have a 55" OLED plugged into their PC
- the console market atm is dominated by AMD as the hardware supplier
So if future OLEDs from LG include VRR support (which AFAIK is required for full HDMI 2.1 support), the ones that will benefit the most are owners of next-gen consoles based on AMD hardware.

Again, it's a forward-looking driver implementation of HDMI 2.1 VRR. All Turing cards with HDMI video output now support the compulsory HDMI 2.1 VRR feature, although they are not HDMI 2.1 cards. So if your TV has an HDMI 2.1 input, you can force G-Sync compatibility from the Nvidia control panel.
 
Joined
Dec 22, 2011
Messages
3,890 (0.86/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
And to add to that: the G-Sync Compatible badge is not something a monitor/TV manufacturer can just slap on anyway; Nvidia does run a validation process for those too (albeit not quite as strict as the one for G-Sync and G-Sync Ultimate monitors).

Indeed, if anything G-Sync Compatible helps separate the good FreeSync displays from the trash.
 
Joined
Aug 9, 2019
Messages
226 (0.13/day)
Nvidia is pretty much targeting a niche with its marketing, since:
- HDMI 2.1 is nonexistent on PC
- PC gamers mostly use desktop displays; not many of them have a 55" OLED plugged into their PC
- the console market atm is dominated by AMD as the hardware supplier
So if future OLEDs from LG include VRR support (which AFAIK is required for full HDMI 2.1 support), the ones that will benefit the most are owners of next-gen consoles based on AMD hardware.
Are you joking? Everyone I know uses a 43"+ TV plugged into their PC. I use a 55" NU8000/65" NU8000, and now I will upgrade to the LG OLED or a short-throw laser projector once the price drops a little and HDMI 2.1 fully arrives. I'm so excited for it; I can't wait for the 3080 Ti or whatever comes next. I've been using a 1080 Ti for so long it makes me crazy.

If you waste your money on a gaming display 27" or smaller, shame on you.

Funny, most people say "oh, the 55" doesn't fit my allotted space"; well, then the first thing you need to do is focus on getting a bigger room! Then a bigger TV, then a better experience ;P I went from a 21" Trinitron to an LG 37" to a 43" 4K to a 55" 1440p@120Hz Samsung TV, or the 65", depending on the room. Even my mom uses a 4K 43"+ TV for her desktop. Welcome to 5+ years ago :)
 
Joined
Feb 23, 2008
Messages
1,064 (0.18/day)
Location
Montreal
System Name Aryzen / Sairikiki / Tesseract
Processor 5800x / i7 920@3.73 / 5800x
Motherboard Steel Legend B450M / GB EX58-UDP4 / Steel Legend B550M
Cooling Mugen 5 / Pure Rock / Glacier One 240
Memory Corsair Something 16 / Corsair Something 12 / G.Skill 32
Video Card(s) AMD 6800XT / AMD 6750XT / Sapphire 7800XT
Storage Way too many drives...
Display(s) LG 332GP850-B / Sony w800b / Sony X90J
Case EVOLV X / Carbide 540 / Carbide 280x
Audio Device(s) SB ZxR + GSP 500 / board / Denon X1700h + ELAC Uni-Fi 2 + Senn 6XX
Power Supply Seasonic PRIME GX-750 / Corsair HX750 / Seasonic Focus PX-650
Mouse G700 / none / G602
Keyboard G910
Software w11 64
Benchmark Scores I don't play benchmarks...
So, LG OLED gets VRR? This better not be locked to one camp...
 
Joined
Aug 20, 2007
Messages
20,789 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
I wonder if AMD cares about you; you already bought their cards and are in love with them, so why should they try to fix their drivers :p :p :p :love:

Irrelevant to the discussion and basically just baiting. Don't do that please.
 