
In search of the "perfect" monitor...

There are only two OLED manufacturers currently relevant in desktop monitors: LG and Samsung. At 1440p 27 inch there are two panels: an older LG WOLED one that all the 240Hz displays are based on, and a new Samsung QD-OLED one that's in the 360Hz displays coming out now. If budget isn't an issue for you, the newer panel is better. It has better peak and sustained brightness, better text rendering, and a good semi-glossy coating that seemingly doesn't affect colors and deals with glare quite well. Samsung also claims to have "solved" burn-in with their OLED care systems, so… take that for what it's worth.
Now, the LG WOLED-based monitors are by no means bad; they are just strictly inferior to the newer models. But if the price differential where you live is too much to swallow, they aren't a bad option at all.
Glossy vs. matte isn't a choice here. All the QD-OLED displays are semi-glossy. It's a good coating; I haven't seen any complaints from reviewers. All the WOLED ones, to my knowledge, use a light AG coating, which has been known to be divisive. There IS a glossy option with the Dough Spectrum AFAIK when ordering from them, but I personally wouldn't touch anything from that company with a ten-foot pole.
 
I think if it were me, I'd wait for the 4K 240Hz panels coming from LG later this year. So far, QD-OLED is much more prone to burn-in than WOLED, even if burn-in is unlikely when the displays are used the way they are meant to be.

I've been using an OLED as a monitor for 3 years and I will never go back, but I'm also not concerned with replacing my display every 4-5 years; that's my normal cadence for when a new generation of monitors/displays is clearly better.
 
Hardware Unboxed do some great reviews, from best bang-for-buck to high-end expensive models.

 
I think if it were me, I'd wait for the 4K 240Hz panels coming from LG later this year. So far, QD-OLED is much more prone to burn-in than WOLED, even if burn-in is unlikely when the displays are used the way they are meant to be.

I've been using an OLED as a monitor for 3 years and I will never go back, but I'm also not concerned with replacing my display every 4-5 years; that's my normal cadence for when a new generation of monitors/displays is clearly better.
Thanks for explaining things! I've also watched a few videos that pretty much confirm what you say.
QD-OLED Glossy it will be.

I think if it were me, I'd wait for the 4K 240Hz panels coming from LG later this year. So far, QD-OLED is much more prone to burn-in than WOLED, even if burn-in is unlikely when the displays are used the way they are meant to be.

I've been using an OLED as a monitor for 3 years and I will never go back, but I'm also not concerned with replacing my display every 4-5 years; that's my normal cadence for when a new generation of monitors/displays is clearly better.
If that's something you can afford, why not?
Sell the old monitor used to get some cash back and pay a little extra for a new one; makes sense.

Hardware Unboxed do some great reviews, from best bang-for-buck to high-end expensive models.

Thanks for the recommendation. Hardware Unboxed also has a dedicated channel, "Monitors Unboxed".
I haven't seen the particular video you sent, so I'll have a look.

A few more questions:
Does ABL (Auto Brightness Limiter) work differently from monitor to monitor and manufacturer to manufacturer?
Can ABL be controlled (disabled/enabled) by the user, and is there any risk in doing so?
How often, and how long, does a pixel refresh take on average?
If I turn off HDR on an HDR OLED monitor, will it revert to SDR? What exactly is the impact? It will still use 10-bit if that's the panel's depth, right?

Just saw this video from HDTVTest about burn-in; it explains a lot.

And I rewatched this video from TechAltar; it's certainly very interesting.

My takeaways from those videos are:
Samsung QD-OLED is brighter than LG, and brightness matters for OLEDs.

Four upcoming innovations:

1)
Samsung is developing tandem OLED: two OLED layers stacked on top of each other.
This results in double the brightness and four times the lifespan.
Expected launch date: 2024.

2)
A company named UDC claims its blue PHOLED material is coming to Samsung & LG by 2025.
The material currently used for the blue sub-pixel will be replaced with the new one.
The new material is said to be four times as efficient at converting electricity into light,
leading to massive brightness improvements.

3)
Micro-OLED: the pixels are deposited directly on silicon,
meaning the whole control electronics basically becomes a chip,
resulting in tiny, insanely high-resolution OLEDs.

4)
Samsung is experimenting with replacing the blue organic LEDs in its QD-OLEDs entirely
with gallium nitride "GaN nanorod" emitters, which are inorganic and don't burn in.

It really appears to be worth waiting until 2025, especially with the NVIDIA 50 series on the horizon.

I think it really comes down to...

• Resolution: 2560x1440
• Refresh rate: 360Hz
• Panel depth: 10-bit
• Colors: 30-bit (10-bit per channel)
• Response time: 1ms or less MPRT
• Size: 27 inch
• Aspect ratio: 16:9 or 21:9
• Panel type: OLED*
• Technology: G-Sync
• Brightness: 1000 nits or more
• Connector: DisplayPort 1.4 or 2.1
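As a rough sanity check on the connector line in that list, here's a quick bandwidth estimate (a sketch with nominal numbers, ignoring blanking overhead, which adds a few percent in practice):

```python
# Uncompressed video data rate for 2560x1440 @ 360Hz, 10-bit RGB (30 bpp)
width, height, refresh, bits_per_pixel = 2560, 1440, 360, 30
required_gbps = width * height * refresh * bits_per_pixel / 1e9  # ~39.8 Gbit/s

# Approximate usable payload after link encoding overhead
dp14_hbr3_gbps = 25.92    # DisplayPort 1.4 (HBR3, 8b/10b encoding)
dp21_uhbr20_gbps = 77.37  # DisplayPort 2.1 (UHBR20, 128b/132b encoding)

print(f"required: {required_gbps:.1f} Gbit/s")
print("fits DP 1.4 uncompressed:", required_gbps <= dp14_hbr3_gbps)   # False -> needs DSC
print("fits DP 2.1 uncompressed:", required_gbps <= dp21_uhbr20_gbps) # True
```

So on DP 1.4 the 360Hz 10-bit combination only works with DSC enabled, while DP 2.1 has the headroom to run it uncompressed.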
 
• Technology: G-Sync
Why would you want to pay the gstink tax when FreeSync works on both AMD and NVIDIA without the extra tax $$$?
 
Why would you want to pay the gstink tax when FreeSync works on both AMD and NVIDIA without the extra tax $$$?
The G-Sync hardware module performs better than FreeSync, which barely even requires verification and testing, so performance can be all over the place in FreeSync-"rated" monitors.

What you pay is what you get.

Similar to how both NVIDIA and AMD have frame generation, but the AMD solution doesn't have an equivalent of NVIDIA's optical flow accelerator, so it's not exactly "equivalent", with noticeably worse image quality.
 
The G-Sync hardware module performs better than FreeSync, which barely even requires verification and testing, so performance can be all over the place in FreeSync-"rated" monitors.
Kinda. FreeSync (or Vesa Adaptive sync, technically) CAN work just as well as GSync with a dedicated module if the monitor manufacturer has created a well tuned scaler. The Samsung one in the higher end SVA Odyssey screens is excellent, for example, even providing variable overdrive. But you are rolling the dice, pretty much, unless you've studied reviews for a specific model diligently. There are some truly awful FS implementations out there. While the GSync module is guaranteed to perform as well as possible pretty much always.
Should be noted that since OP is looking at OLED displays this is much less of a big deal. The biggest selling point of GSync modules is, as mentioned, variable overdrive, and since OLED screens don't use overdrive anyway the difference is minuscule. Certainly, I haven't seen any reliable criticism of how VRR performs on current OLED screens, whether the older WOLED ones or newer QD-OLED ones.
 
@Mandolo
IIRC, replacing the blue in OLEDs is so you won't get a color shift as the panel ages,
since brightness drops unevenly between colors (green/red "last" longer),
not to prevent burn-in.
 
Kinda. FreeSync (or Vesa Adaptive sync, technically) CAN work just as well as GSync with a dedicated module if the monitor manufacturer has created a well tuned scaler. The Samsung one in the higher end SVA Odyssey screens is excellent, for example, even providing variable overdrive. But you are rolling the dice, pretty much, unless you've studied reviews for a specific model diligently. There are some truly awful FS implementations out there. While the GSync module is guaranteed to perform as well as possible pretty much always.
Should be noted that since OP is looking at OLED displays this is much less of a big deal. The biggest selling point of GSync modules is, as mentioned, variable overdrive, and since OLED screens don't use overdrive anyway the difference is minuscule. Certainly, I haven't seen any reliable criticism of how VRR performs on current OLED screens, whether the older WOLED ones or newer QD-OLED ones.
ULMB AFAIK is exclusive to Gsync modules.
 
@dgianstefani
Yeah, that exact technology (or branding, I suppose) is, but there are other blur reduction techs that are not. DyAc from Benq or PureXP+ from Viewsonic are no worse really and do not require a GSync module. So YMMV there.
Again, sort of a moot point since no OLED screen so far, to my knowledge, included any BFI tech at all.
 
@manol0
Make sure to read up on burn-in vs. image retention, as many seem to mislabel them or not know the difference.
Of the past LG OLED TVs we sold in the store where I worked, virtually all had (only) issues with IR
within about 8-12 months of running 9-to-9 each day showing clips,
while the Sonys using the same panel did not, so it seems more related to software than hardware.

@Onasi
While not monitors per se, all the LG/Sony OLED TVs I have seen have BFI,
and some recent/upcoming 240Hz monitors seem to have it.
 
@Waldorf
Yup, you are right; at least the Asus 4K 240Hz one does seem to have BFI under their ELMB feature. I was looking at the 27 inch ones since the OP wanted that size, and those seemingly do not, even the new 360Hz models, which is curious. And no one EXCEPT for Asus has announced support for the feature, while Asus claimed that the upcoming 480Hz WOLED model will also have ELMB. Hard to say what is happening here; maybe everyone except Asus decided to abandon the feature to save costs, since it requires the GSync module, and/or considers it unnecessary on OLED due to the fast response times.
 
@dgianstefani
Yeah, that exact technology (or branding, I suppose) is, but there are other blur reduction techs that are not. DyAc from Benq or PureXP+ from Viewsonic are no worse really and do not require a GSync module. So YMMV there.
Again, sort of a moot point since no OLED screen so far, to my knowledge, included any BFI tech at all.
G-Sync Ultimate synchronises down to 1 Hz; FreeSync goes down to 9/48 Hz depending on the tier, and both use LFC when out of their range. G-Sync also has slightly lower input lag, although this is a minimal advantage on OLED due to the already low values. G-Sync Ultimate also comes with excellent HDR calibration (part of the certification process) and allows both HDR1000 and G-Sync to be active at the same time. G-Sync modules also offer good hardware timing accuracy, rather than the crappy software scalers in FreeSync/G-Sync Compatible monitors.
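To illustrate what LFC does when the frame rate falls below the VRR floor (a minimal sketch, assuming a hypothetical 48-240Hz VRR window; real scaler firmware is more involved):

```python
def lfc_refresh(fps: float, vrr_min: float = 48, vrr_max: float = 240):
    """Low Framerate Compensation: repeat each frame enough times that
    the effective refresh rate lands back inside the VRR window.
    Returns (repeat_multiplier, effective_refresh_hz)."""
    if fps >= vrr_min:
        return 1, min(fps, vrr_max)  # native VRR, capped at the panel max
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return multiplier, fps * multiplier

# A 30 fps game on a 48-240Hz panel: each frame is scanned out twice,
# so the panel runs at an effective 60Hz with no tearing.
print(lfc_refresh(30))  # (2, 60)
```

The point is that the sync range quoted for a monitor understates what you actually get: with LFC, even single-digit frame rates stay tear-free because the scaler multiplies them back into range.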
 
@Onasi
True, I missed the 27-inch part, as it can't be "too big" for me :D

Yeah, that's some BS, to have it on one model but not on a different size.
 
G-Sync Ultimate synchronises down to 1 Hz; FreeSync goes down to 9/48 Hz depending on the tier, and both use LFC when out of their range. G-Sync also has slightly lower input lag, although this is a minimal advantage on OLED due to the already low values. G-Sync Ultimate also comes with excellent HDR calibration (part of the certification process) and allows both HDR1000 and G-Sync to be active at the same time. G-Sync modules also offer good hardware timing accuracy, rather than the crappy software scalers in FreeSync/G-Sync Compatible monitors.
I mean, I have been running a GSync screen for years, so I am aware. The issue is that, again, for all of its benefits, manufacturers (except for Asus, mostly) seem to be moving away from a dedicated GSync module. To my knowledge, all of the 27 inch WOLEDs were FS/GS Compatible. Seems the trend is continuing with the 360Hz QD-OLED models. Sure, at least the Asus 4K QD-OLED 32 incher does have the module, but so far I am not sure if the others do. And judging by the article just posted about the new WOLED 240Hz 4K from LG, that does not have a module either. That's a $1400 MSRP display, btw, so that feels especially scummy. The Asus one that will come with the same panel later, though, should.
So in short, I guess we are in a timeline where if you want a GSync module on your already pricey display, you have to pay "the Asus tax". Fabulous.
 
I mean, I have been running a GSync screen for years, so I am aware. The issue is that, again, for all of its benefits, manufacturers (except for Asus, mostly) seem to be moving away from a dedicated GSync module. To my knowledge, all of the 27 inch WOLEDs were FS/GS Compatible. Seems the trend is continuing with the 360Hz QD-OLED models. Sure, at least the Asus 4K QD-OLED 32 incher does have the module, but so far I am not sure if the others do. And judging by the article just posted about the new WOLED 240Hz 4K from LG, that does not have a module either. That's a $1400 MSRP display, btw, so that feels especially scummy. The Asus one that will come with the same panel later, though, should.
So in short, I guess we are in a timeline where if you want a GSync module on your already pricey display, you have to pay "the Asus tax". Fabulous.
G-Sync is like Betamax: even though it was the better tech, VHS dominated because of its openness. These new 4K 240Hz monitors are already expensive; adding another module that gives you 5% more performance would not work with that notion. That would make a $1400 monitor more like $1700.
 
G-Sync is like Betamax: even though it was the better tech, VHS dominated because of its openness. These new 4K 240Hz monitors are already expensive; adding another module that gives you 5% more performance would not work with that notion. That would make a $1400 monitor more like $1700.
People who are buying monitors that cost more than $1k don't really care, they just want the best. For the rest there's cheaper models.
 
People who are buying monitors that cost more than $1k don't really care, they just want the best. For the rest there's cheaper models.
You are acting like a 32" monitor for $1400 is normal. Not everyone buys Asus and Alienware panels. I understand, but you have to realize that not everyone into PCs can easily afford that much for a panel. Where I live, a good 165Hz 1440p panel is about $400. The adoption of G-Sync is nowhere near as entrenched as FreeSync; even VRR in TVs is based on FreeSync. BTW, that is a joy when you put in a GPU that supports 4K 120Hz, so that even TVs give you tear-free gaming. Most of us bought those Korean monitors from QNIX and others a few years ago because of the price/performance.
 
You are acting like a 32" monitor for $1400 is normal. Not everyone buys Asus and Alienware panels. I understand, but you have to realize that not everyone into PCs can easily afford that much for a panel. Where I live, a good 165Hz 1440p panel is about $400. The adoption of G-Sync is nowhere near as entrenched as FreeSync; even VRR in TVs is based on FreeSync. BTW, that is a joy when you put in a GPU that supports 4K 120Hz, so that even TVs give you tear-free gaming. Most of us bought those Korean monitors from QNIX and others a few years ago because of the price/performance.
People who are buying monitors that cost more than $1k don't really care, they just want the best. For the rest there's cheaper models.
 
@dgianstefani
Okay, just to get back to you on our talk about BFI, ULMB and GSync: turns out the new Asus PG32UCDM QD-OLED does support BFI under the ELMB feature. However, it does NOT have a dedicated GSync module, just compatibility. So at this point it seems that tying ULMB support to a dedicated module really is just a philosophical difference, if even Asus decided to go their own way with an ostensibly flagship screen.
 
Why would you want to pay the gstink tax when FreeSync works on both AMD and NVIDIA without the extra tax $$$?
The only reason I want G-Sync is that I will purchase an NVIDIA graphics card. If it were AMD, I'd go for FreeSync instead.
Btw, I'm fascinated with AMD's FSR: I've been playing "A Plague Tale: Requiem" with my 1080 Ti and had 30-40 FPS at high settings; unplayable.
Then I installed that mod to enable AMD's upscaling "FSR 2" and got 70-80 FPS in quality mode.
The dialogue of that game was ridiculously cringeworthy, but the graphics were like nothing I've seen before in a video game, and it was so smooth with FSR.
AMD did a great job there.

Will be getting a Ryzen next. Screw Intel.
I still hate that AMD participates in the NSA spyware program (PSP, Platform Security Processor), just like Intel does with their "ME" (Management Engine), but AMD definitely caught my attention with the 7800X3D.
 
The only reason I want G-Sync is that I will purchase an NVIDIA graphics card. If it were AMD, I'd go for FreeSync instead.
Btw, I'm fascinated with AMD's FSR: I've been playing "A Plague Tale: Requiem" with my 1080 Ti and had 30-40 FPS at high settings; unplayable.
Then I installed that mod to enable AMD's upscaling "FSR 2" and got 70-80 FPS in quality mode.
The dialogue of that game was ridiculously cringeworthy, but the graphics were like nothing I've seen before in a video game, and it was so smooth with FSR.
AMD did a great job there.

Will be getting a Ryzen next. Screw Intel.
I still hate that AMD participates in the NSA spyware program (PSP, Platform Security Processor), just like Intel does with their "ME" (Management Engine), but AMD definitely caught my attention with the 7800X3D.
You do realise that FreeSync is just the variable sync option included in the VESA DisplayPort standard, right? NVIDIA just wants you to buy extra equipment in order to do that very thing, and makes it proprietary to NVIDIA GPUs only.
 
DSC (Display Stream Compression) must be used for 240Hz, which is where the problem lies. You want chroma 4:4:4, aka full colors? You can't have it. I personally think DSC makes colors dull and ugly. This is why I only run my 1440p panel at 144Hz instead of 165Hz.

Why buy an OLED if the colors will be TN-panel level with DSC?
I'm running DSC on a 240Hz OLED now, and the colors are better than my Gigabyte 32U IPS at 144Hz... it also passes the chroma 4:4:4 tests and the RTINGS test... so I'm confused by this statement.
ChromaRes.png (1920×1080) (madshi.net)

Also not seeing any IQ difference between 120Hz and 240Hz, either in content or on lagom.nl.
 
You do realise that FreeSync is just the variable sync option included in the VESA DisplayPort standard, right? NVIDIA just wants you to buy extra equipment in order to do that very thing, and makes it proprietary to NVIDIA GPUs only.
GSync hasn’t been NV only for a while now, all current modules support FS/Vesa Adaptive Sync and work with every GPU.
I'm running DSC on a 240Hz OLED now, and the colors are better than my Gigabyte 32U IPS at 144Hz... it also passes the chroma 4:4:4 tests... so I'm confused by this statement.
Read further; @ir_cow was mixing up how old displays that were not running DSC resorted to subsampling with how DSC itself works. We cleared that up.
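To make that distinction concrete (a sketch with nominal numbers; DSC is typically configured around 3:1 compression, while subsampling simply discards chroma resolution):

```python
def avg_bits_per_pixel(bit_depth=10, chroma="4:4:4", dsc_ratio=1.0):
    """Average transmitted bits per pixel for a given chroma format,
    optionally divided by a DSC compression ratio."""
    # color samples per pixel, averaged over a 2x2 pixel block
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}
    return bit_depth * samples[chroma] / dsc_ratio

full = avg_bits_per_pixel()                      # 30.0 bpp: full color, uncompressed
subsampled = avg_bits_per_pixel(chroma="4:2:0")  # 15.0 bpp: chroma detail thrown away
compressed = avg_bits_per_pixel(dsc_ratio=3.0)   # 10.0 bpp: full 4:4:4, DSC-compressed

print(full, subsampled, compressed)  # 30.0 15.0 10.0
```

The old bandwidth-starved displays dropped to 4:2:0, which visibly degrades colored text; a DSC display instead keeps the full 4:4:4 sampling grid and compresses it, which is why it still passes the chroma tests.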
 
@kapone32
Only because it's coming with the newer HDMI certifications; otherwise there would still be no sync on those screens. It's not because it's AMD.

@Mandolo
You might still be using FreeSync, depending on the screen.
I have a 2080S, but I'm using a TV, so Windows uses FreeSync (though the NV panel shows G-Sync).
 