
LG Unveils First OLED TVs with NVIDIA G-SYNC Support

Joined
Mar 13, 2012
Messages
277 (0.06/day)
Next-gen nVidia cards will have HDMI 2.1, and G-Sync will use HDMI 2.1 on both monitors and TVs (LCD/OLED), as well as DisplayPort where needed.

See this as a dry run and test for the upcoming G-Sync Compatible invasion by nVidia.

nVidia aims to push down FreeSync as much as possible with a massive G-Sync Compatible push in the coming years.

The normal G-Sync module with DP 1.2 and 8-bit color will live on for a while, as long as manufacturers want it, and will then be put out to pasture without being replaced with new hardware.

The new normal will then be G-Sync Compatible, while the so-called upper mid-range and high end will use G-Sync Ultra.

By the way, nVidia is actually working on a custom chip for a fanless version of the G-Sync Ultra module, BUT it is still up to two years away from appearing in products; they want to get away from Intel's expensive and hot FPGA, the Altera Arria 10 GX 480.
 
Joined
Oct 31, 2013
Messages
186 (0.05/day)
I totally forgot, how about burn-in issues for these TVs?

Wasn't there an Australian lawsuit against LG (they lost) because they didn't inform the customer properly about this issue and that it is not covered by warranty?

And can the FPGA unit mine coins when I don't play games and use G-Sync?
 
Joined
Aug 31, 2016
Messages
104 (0.04/day)
Great news. VRR adoption has been really slow in the TV market, and it is great to see a major manufacturer actually making the effort to get it working properly. And despite all the baseless hate, having a certification like G-Sync Compatible that actually guarantees some standards, instead of being slapped on everything regardless of whether it works or not as is the case with FreeSync, is also a good thing; everyone with at least some experience with various Adaptive Sync enabled desktop displays knows that. Not to say that there are no good Adaptive Sync implementations, but they are all over the place.

Not really going to make me switch from an ultrawide monitor to a TV for games; even despite the vast image quality advantage these OLEDs have, I will still use an OLED TV for movies only. But options are always good, and there are certainly many people for whom this announcement is a game changer.
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
See this as a dry run and test for the upcoming G-Sync Compatible invasion by nVidia.

nVidia aims to push down FreeSync as much as possible with a massive G-Sync Compatible push in the coming years.
This has already started. Haven't you seen the monitor announcements from before IFA? Barely a mention of FreeSync or Adaptive Sync, but a surprising number of "G-Sync Compatible" launches.

Have to wonder just how much Nvidia is incentivising this. Free certifications? Marketing support? Also, judging by the press releases and spec sheets presented, it might seem like something is preventing manufacturers from mentioning competing brandings/VRR solutions alongside GS Compatible (after all, any GS Compatible DP display is by definition FreeSync and VESA AS compatible) - I wonder what the terms of that program are like.
 
Joined
Aug 31, 2016
Messages
104 (0.04/day)
Wasn't there an Australian lawsuit against LG (they lost) because they didn't inform the customer properly about this issue and that it is not covered by warranty?

You can read about it on the web. It was about misinforming customers about their rights, two customers to be precise. LG there acted as if their warranty were the ultimate law and consumer guarantee rights didn't exist, for which they were fined $160k. This kind of behavior is not uncommon among big companies and shops who think they can make their own law, especially if the law is weak and allows for poor warranties, restocking fees on returns and other anti-customer practices. I don't know the law in Australia or LG's policy there, but since these were only two customers I guess this was more of a local issue, some bad employees handling the RMA wrong or something like that.

Though I have heard that the warranty on TVs can be really bad depending on what the law in the region allows; apparently there are places where they can just say that burn-in is normal and not covered by warranty, and it is perfectly legal.
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
This has already started. Haven't you seen the monitor announcements from before IFA? Barely a mention of FreeSync or Adaptive Sync, but a surprising number of "G-Sync Compatible" launches.
Have to wonder just how much Nvidia is incentivising this. Free certifications? Marketing support?
G-Sync Compatible has specific requirements that are useful for gaming and show the quality of the solution to some degree. DP Adaptive-Sync only shows that the feature exists, and FreeSync mirrors that. Even if Nvidia is not incentivising this in any substantial way, there is value in it.
 
Joined
Feb 23, 2008
Messages
1,064 (0.18/day)
Location
Montreal
System Name Aryzen / Sairikiki / Tesseract
Processor 5800x / i7 920@3.73 / 5800x
Motherboard Steel Legend B450M / GB EX58-UDP4 / Steel Legend B550M
Cooling Mugen 5 / Pure Rock / Glacier One 240
Memory Corsair Something 16 / Corsair Something 12 / G.Skill 32
Video Card(s) AMD 6800XT / AMD 6750XT / Sapphire 7800XT
Storage Way too many drives...
Display(s) LG 332GP850-B / Sony w800b / Sony X90J
Case EVOLV X / Carbide 540 / Carbide 280x
Audio Device(s) SB ZxR + GSP 500 / board / Denon X1700h + ELAC Uni-Fi 2 + Senn 6XX
Power Supply Seasonic PRIME GX-750 / Corsair HX750 / Seasonic Focus PX-650
Mouse G700 / none / G602
Keyboard G910
Software w11 64
Benchmark Scores I don't play benchmarks...
Got to admit, nVidia wedging their branding everywhere is getting more and more tiresome. At least Samsung hasn't folded... yet...
 
Joined
Jun 10, 2014
Messages
2,900 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
I totally forgot, how about burn-in issues for these TVs?
Since the misconception about "burn-in" comes up in every OLED-related thread, I will just quote my previous post on the matter:
Actual burn-in is something that happened with CRT technology if the phosphors were exposed to a static image for too long. OLED and plasma technologies don't have actual burn-in, but they have a problem people commonly confuse with burn-in: uneven wear. The distinction between this and burn-in is very important, both to judge whether it is a problem for your use case, and to prevent issues for those who end up buying these screens.

All current OLED displays suffer from this problem, ranging from mobile devices to laptops and televisions. The problem is caused by some pixels being more worn than others, the result of a usage pattern where certain portions of the screen are on average significantly brighter than the rest. It has nothing to do with pictures being static or not, only with whether your usage pattern is uneven enough. You can use your phone or television all day with no static pictures at all, but if your usage pattern continues to be uneven, what people mistakenly call "burn-in" will still gradually get worse. And the other way around: if your average brightness is fairly stable, you can have completely static pictures for hours every day without any problem, because it has nothing to do with them being static or not, so let's end that myth right now.

The good news is that OLED is more robust to wear than plasma when the brightness is not very high, which means you can have things like GUI elements on your gaming screen all day, and as long as they are not much brighter than the screen average, you will have no problems with "burned-in" patterns. Bright static or moving patterns should concern you, not because they are static, but because they are brighter than the rest of the screen over time.
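To make the point concrete, here is a toy model I made up (arbitrary numbers, nothing to do with how any panel vendor actually measures this) where "wear" is simply the cumulative brightness each pixel has displayed; a bright static HUD element ends up far more worn than the varied, never-static content around it:

```python
# Toy model of uneven OLED wear: wear tracks cumulative per-pixel brightness,
# not whether the content is static. All numbers are arbitrary illustrations.
import numpy as np

H, W = 90, 160                       # tiny stand-in for a screen
wear = np.zeros((H, W))              # cumulative exposure per pixel

rng = np.random.default_rng(0)
for hour in range(1000):             # 1000 hours of simulated use
    frame = rng.uniform(0.2, 0.6, size=(H, W))   # varied, never-static content
    frame[0:5, 0:40] = 0.95                      # bright, static HUD element
    wear += frame                                # wear ~ brightness x time

relative = wear / wear.mean()
print("HUD region wear vs screen average:", round(float(relative[0:5, 0:40].mean()), 2))
# Prints roughly 2.3: the HUD area is over twice as worn as the average,
# even though everything around it was changing every single frame.
```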

So would I personally buy an OLED monitor/TV? Well, it depends on how I would use the screen. I wouldn't buy one for kids to use without supervision, I wouldn't go with OLED for a screen that displays TV content all day, and I wouldn't go for OLED for a monitor displaying bright web pages all day either. But for a living room "TV" setup for movies and gaming I would certainly go for the superior picture quality of OLED. A well "cared for" OLED (or plasma) will of course have minor imperfections in the panel after several years of usage, but not even close to what every LCD panel already has from the factory, so I wouldn't worry if the usage is right.

When it comes to "protective" mechanisms in televisions, these are pretty much the same for OLED as for plasma: pixel shifting and some dimming for very bright screens over time. These may help with sharp uneven wear at the edges of patterns, but not with the overall "problem".
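And to illustrate what pixel shifting actually does (this is just the rough idea, not any manufacturer's actual algorithm): the set periodically nudges the whole image by a pixel or two, so the hard edges of bright static elements don't wear a sharp line into the panel:

```python
# Rough idea behind pixel shifting / orbiting: cycle a tiny offset over time
# so edges of static elements are smeared across a few neighbouring pixels.
import numpy as np

OFFSETS = [(0, 0), (1, 0), (1, 1), (0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1)]

def shifted(frame: np.ndarray, minutes_elapsed: int, period_min: int = 10) -> np.ndarray:
    """Return the frame nudged by the offset for the current time slot."""
    dy, dx = OFFSETS[(minutes_elapsed // period_min) % len(OFFSETS)]
    return np.roll(frame, shift=(dy, dx), axis=(0, 1))
```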
 
Joined
May 8, 2018
Messages
1,495 (0.69/day)
Location
London, UK
"1440p @ 120 Hz : 6.9 ms " maximum 120hz you can go with it. Good thing is that, it has hdmi 2.1 and support vrr but what is the point of hdmi 2.1 if 4k cant go to 120hz? So move along nothing to see here, perhaps next year lg oled tv 55e9 will be 4k 120hz then it will be a good buy.

Since the misconception about "burn-in" comes up in every OLED-related thread, I will just quote my previous post on the matter:

So basically, OLED TVs only last a limited time if you use them too much. I use my monitor at least 18 hours per day, so in my case an OLED TV would last a maximum of 2 years or so. I think that is fine even if it's only 2 years of use, although the price needs to get cheaper so I can buy a new one, or, simpler, just replace the panel.
 
Joined
Aug 31, 2016
Messages
104 (0.04/day)
Got to admit, nVidia wedging their branding everywhere is getting more and more tiresome. At least Samsung hasn't folded... yet...

This is not "nVidia wedging their branding everywhere"; these are various display manufacturers partnering with the market-leading GPU manufacturer to their own benefit, why else would they do that...

Also, it is not some empty branding, it is a certification that guarantees a certain standard to the customer. And for products from manufacturers like LG or Samsung, whose VRR/Adaptive Sync implementations so far have been terrible 9 times out of 10, with ridiculously narrow ranges and tons of flicker, this is especially useful, and especially for TVs, which typically don't even mention essential things in their official specifications. I don't really see a valid reason to complain here, unless you have some irrational aversion to NVIDIA or any other manufacturer.
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Joined
Jun 27, 2016
Messages
290 (0.10/day)
System Name MacBook Pro 16"
Processor M1 Pro
Memory 16GB unified memory
Storage 1 TB
Nah thanks, I still don't want to sell my family to slave merchants.
 

FreedomEclipse

~Technological Technocrat~
Joined
Apr 20, 2007
Messages
23,363 (3.76/day)
Location
London,UK
System Name Codename: Icarus Mk.VI
Processor Intel 8600k@Stock -- pending tuning
Motherboard Asus ROG Strixx Z370-F
Cooling CPU: BeQuiet! Dark Rock Pro 4 {1xCorsair ML120 Pro|5xML140 Pro}
Memory 32GB XPG Gammix D10 {2x16GB}
Video Card(s) ASUS Dual Radeon™ RX 6700 XT OC Edition
Storage Samsung 970 Evo 512GB SSD (Boot)|WD SN770 (Gaming)|2x 3TB Toshiba DT01ACA300|2x 2TB Crucial BX500
Display(s) LG GP850-B
Case Corsair 760T (White)
Audio Device(s) Yamaha RX-V573|Speakers: JBL Control One|Auna 300-CN|Wharfedale Diamond SW150
Power Supply Corsair AX760
Mouse Logitech G900
Keyboard Duckyshine Dead LED(s) III
Software Windows 10 Pro
Benchmark Scores (ノಠ益ಠ)ノ彡┻━┻
Nvidia paid LG a tonne of money... Because it would have made more sense if these were 'FREESYNC' panels/TVs, given the fact that AMD pretty much has the console market nailed...

I wonder if LG approached AMD first about the marketing strategy, but they turned it down, so LG turned to Nvidia, who just happened to be sitting in the same office at the time?

I wouldn't say sales would have been through the roof for FreeSync TVs, but it would definitely have added to the appeal, if any at all.
 
Joined
Dec 22, 2011
Messages
3,890 (0.86/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Freesync is an AMD trademark, ironic really.
 
Joined
Sep 17, 2014
Messages
20,917 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Screw Gsync. Give me panels smaller than that 55 inch and we can talk. 1080p/1440p and 24~32 inch... we want YOU
 