
LG Reveals Full Specifications and Pricing for the 4K UltraGear 32GS95UE-B OLED Monitor

If you want to believe in it, count me out; DSC is lossy, just like MP3.
Do… do you not understand what “visually lossless” means? Of course DSC does lose SOME information in the process of compression, but the point is that it’s irrelevant, since the actual user does not notice it. Neither do colorimeters, by the way; there is no color or gamma change between running with or without DSC. It’s not a matter of belief. It’s provably so.
But since you brought up MP3, I kinda already get that we are in a “feels over facts” situation. I guess you would also say that you absolutely can hear the difference between a high-bitrate MP3 recording and a lossless file of the same on consumer-level hardware, yeah?
 
Do… do you not understand what “visually lossless” means? Of course DSC does lose SOME information in the process of compression, but the point is that it’s irrelevant, since the actual user does not notice it. Neither do colorimeters, by the way; there is no color or gamma change between running with or without DSC. It’s not a matter of belief. It’s provably so.
But since you brought up MP3, I kinda already get that we are in a “feels over facts” situation. I guess you would also say that you absolutely can hear the difference between a high-bitrate MP3 recording and a lossless file of the same on consumer-level hardware, yeah?
DSC is not visually lossless. If it is lossy, it is not lossless, no matter how much people want to believe otherwise.

FLAC is truly lossless: even though it is compressed, you can at any time decode it back to the original WAV file. Can the same be done with DSC? No, so I leave this quote.

"I hope the term “visually lossless” doesn’t become popular. Lossless is such a technical word that means that something can be recreated data-exact.

If it sounds great, or looks great, but the original signal can’t be reproduced exactly, then it’s lossy. Lossy can be good or bad. It just is what it is."
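The “data-exact” bar that quote sets can be checked mechanically for any truly lossless codec: compress, decompress, and compare bit for bit. A minimal sketch using Python’s zlib as a stand-in for FLAC’s entropy coding:

```python
import zlib

# Lossless compression is defined by an exact round trip:
# decompress(compress(x)) == x, bit for bit.
original = bytes(range(256)) * 100  # stand-in for raw PCM/pixel data

compressed = zlib.compress(original, level=9)
restored = zlib.decompress(compressed)

assert restored == original  # data-exact recreation
print(len(original), "->", len(compressed), "bytes")
```

DSC fails this round-trip test by design; its claim is only that the differences stay below the threshold of visibility.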
 
LG probably makes great monitors; my LG fridge died the other day.
 
Better to go with the Gigabyte or HP Omen 4K 240Hz monitors, since those are the only two models that offer DisplayPort 2.1.

Do… do you not understand what “visually lossless” means? Of course DSC does lose SOME information in the process of compression, but the point is that it’s irrelevant, since the actual user does not notice it. Neither do colorimeters, by the way; there is no color or gamma change between running with or without DSC. It’s not a matter of belief. It’s provably so.
But since you brought up MP3, I kinda already get that we are in a “feels over facts” situation. I guess you would also say that you absolutely can hear the difference between a high-bitrate MP3 recording and a lossless file of the same on consumer-level hardware, yeah?
The trouble with DSC is that many people on the Nvidia forums have stated that they are having screen flickering issues when DSC is enabled at 4K 240Hz. Some are even saying that with DSC enabled, the highest they can go is 4K 120Hz, even when their monitor is spec'd to run at 4K 240Hz. Having a DisplayPort 2.1 socket on the monitor and GPU would completely bypass those bugs and make them a non-issue, because DP 2.1 has enough bandwidth to run 4K 240Hz without the need for DSC. The upcoming Nvidia 50XX series GPUs and AMD's 8000 series GPUs will no doubt be equipped with at least one DP 2.1 connector.
 
Having a DisplayPort 2.1 socket on the monitor and GPU would completely bypass those bugs and make them a non-issue, because DP 2.1 has enough bandwidth to run 4K 240Hz without the need for DSC.
Some versions of DP 2.1. There are three UHBR levels. For example, even though current AMD GPUs have DP 2.1, they CANNOT run 4K 240Hz uncompressed. Furthermore, the DP 2.1 standard technically doesn’t require the device to be UHBR capable at all. Yup. You can have a DP 2.1 port that has the same bandwidth as a 1.4a port. What the standard DOES require, however, is support for DSC. That one is mandatory. UHBR is not.
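The UHBR arithmetic is easy to sanity-check. A rough sketch, counting active pixels only (real CVT-R2 timings add blanking overhead, and the 128b/132b payload factor used here is approximate):

```python
# Back-of-the-envelope: can each DP 2.1 UHBR level carry
# 4K 240 Hz 10-bit RGB uncompressed?

ACTIVE_PIXELS = 3840 * 2160
REFRESH_HZ = 240
BITS_PER_PIXEL = 30  # 10 bits per channel, RGB 4:4:4

needed_gbps = ACTIVE_PIXELS * REFRESH_HZ * BITS_PER_PIXEL / 1e9

# Approximate payload after 128b/132b encoding (~96.7% of raw link rate)
uhbr_payload_gbps = {
    "UHBR10": 40 * 0.967,
    "UHBR13.5": 54 * 0.967,   # what current AMD GPUs expose
    "UHBR20": 80 * 0.967,
}

for level, payload in uhbr_payload_gbps.items():
    verdict = "OK" if payload >= needed_gbps else "needs DSC"
    print(f"{level}: {payload:.1f} Gbit/s vs ~{needed_gbps:.1f} needed -> {verdict}")
```

Even before blanking, ~59.7 Gbit/s of active pixel data exceeds the UHBR13.5 payload, which is why only UHBR20 ports can do 4K 240Hz 10-bit uncompressed.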
 
The upcoming Nvidia 50XX series GPUs and AMD's 8000 series GPUs will no doubt be equipped with at least one DP 2.1 connector.
AMD's 7000 series already has DP 2.1.

Honestly, all this back and forth looks like someone trying to justify their purchase, passing off their bias and statistics as something better than the cold, hard facts.
 
Rip-off. Dearer than even Asus' ROG monitor, which is much better specced and uses QD-OLED. Compared to MSI's offering it looks like utter trash value. But this is LG.
 
Dell was recently selling their 32" QD-OLED monitor for the equivalent of $1,035 USD in Canada. The MSI one is selling for under $1,000.

I don't see how LG can charge 40 percent more. Disappointed. I guess I will buy the QD-OLED one tomorrow, now that we know the LG price. I like the built-in speakers, and that is the only advantage.
 
Yeah, true, but why buy this monitor then if not for 4K 240Hz or 1080p 480Hz?
Because you do not require insane specs, maybe?
 
So wait, is this all some giant conspiracy? DSC is actually visually lossy, but nobody, not the independent experts who tested the algorithm, not the respected reviewers like TFTcentral, RTings or MonitorUnboxed, not the thousands of users all over the world who run these screens daily, not a single one noticed and blew the whistle? Really? That’s what you are saying?
DSC uses 4:2:2 subsampling, so it loses chroma information. The thing is, this is mostly visible on text, but nobody needs 120Hz+ and 10bpc to render MS Word, so the issue remains mostly academic.
Also see: https://en.wikipedia.org/wiki/Display_Stream_Compression#Effect

DSC is not visually lossless. If it is lossy, it is not lossless, no matter how much people want to believe otherwise.

FLAC is truly lossless: even though it is compressed, you can at any time decode it back to the original WAV file. Can the same be done with DSC? No, so I leave this quote.

"I hope the term “visually lossless” doesn’t become popular. Lossless is such a technical word that means that something can be recreated data-exact.

If it sounds great, or looks great, but the original signal can’t be reproduced exactly, then it’s lossy. Lossy can be good or bad. It just is what it is."
JPEG is also lossy. When was the last time you opened one and spotted the blockiness?

Why buy this monitor then?
Because you do photo work and you want the wide gamut and the dynamic range?
Because you want a gaming monitor that will do fast refresh without the drawbacks of overdrive?
Because you want to experience HDR the way God intended?
(All of the above assume that you also have $1.5k+ to part with.)
 
DSC uses 4:2:2 subsampling, so it loses chroma information. The thing is, this is mostly visible on text, but nobody needs 120Hz+ and 10bpc to render MS Word, so the issue remains mostly academic.
Also see: https://en.wikipedia.org/wiki/Display_Stream_Compression#Effect
It’s a bit of a misconception. While DSC CAN use subsampling for higher-bandwidth applications, it does NOT need it at all times or in all scenarios. It has a native 4:4:4 RGB mode that it operates in for most applications, and the main conversion is from RGB to YCoCg. That does not involve subsampling.
There is a lot of good information in the VESA white paper on DSC. It may not be fully up to date, since from my understanding the algorithm has evolved since then, but still.
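The reversibility of that RGB-to-YCoCg step can be shown directly. A sketch of the YCoCg-R lifting transform (the reversible integer variant; the constants follow the commonly published YCoCg-R formulas, so treat this as illustrative rather than a quote of the DSC spec):

```python
# YCoCg-R: a reversible integer color transform. Unlike 4:2:2
# subsampling, this step alone discards no information.

def rgb_to_ycocg_r(r, g, b):
    co = r - b
    t = b + (co >> 1)
    cg = g - t
    y = t + (cg >> 1)
    return y, co, cg

def ycocg_r_to_rgb(y, co, cg):
    t = y - (cg >> 1)
    g = cg + t
    b = t - (co >> 1)
    r = b + co
    return r, g, b

# Round-trip extremes and midtones to demonstrate exact reversibility
for rgb in [(0, 0, 0), (255, 255, 255), (255, 0, 0), (12, 200, 99)]:
    assert ycocg_r_to_rgb(*rgb_to_ycocg_r(*rgb)) == rgb
print("YCoCg-R round trip is exact")
```

Any actual loss in DSC comes from later quantization stages, not from this color conversion.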
 
It’s a bit of a misconception. While DSC CAN use subsampling for higher-bandwidth applications, it does NOT need it at all times or in all scenarios. It has a native 4:4:4 RGB mode that it operates in for most applications, and the main conversion is from RGB to YCoCg. That does not involve subsampling.
There is a lot of good information in the VESA white paper on DSC. It may not be fully up to date, since from my understanding the algorithm has evolved since then, but still.
I know all that, and it is correct. But to get 10bpc at 4K@120Hz, you do need 4:2:2. You're not going to see it unless you're doing some color-critical work, but it's going to be there.
 
I know all that, and it is correct. But to get 10bpc at 4K@120Hz, you do need 4:2:2. You're not going to see it unless you're doing some color-critical work, but it's going to be there.
Uh, not according to VESA's own information. At DP 1.4a (so HBR3) you don’t get to subsampling until 5K 240Hz, if we are talking 10-bit color. You can look up the table for that yourself here, under “Resolution and refresh rate limits”.
 
Uh, not according to VESA's own information. At DP 1.4a (so HBR3) you don’t get to subsampling until 5K 240Hz, if we are talking 10-bit color. You can look up the table for that yourself here, under “Resolution and refresh rate limits”.
Indeed. The amount of total BS people try to spread around here is comical.
 
No DP 2.1, then no buy from me. Do not give in; the minimum manufacturers should do is DP 2.1. You, the people, have the power to say no to the BS. Buy the ones with DP 2.1 only. Next time they will think twice before pulling the same BS.

There's no GPU on the market which supports that.
 
Can it do 1440p 360Hz?

So the thing about DP and LG is that on this screen, HDMI is faster. I have a really good feeling that using HDMI with any recent high-end LG will result in the display switching to 4K 120Hz when starting to play games. It's what happens with LGs when using HDMI. You have to use a third-party tool to delete TV resolutions from the display's HDMI metadata, or else you will be constantly forced into 4K 120Hz whenever you start a game.

There's no good reason NOT to include DP 2.1 on a brand-new $1K+ PC monitor in 2024, but also include the royalty-demanding HDMI interface, unless it's really meant for consoles.
That was an issue on the 1440p LG OLED monitors, and it was there to allow for better 4K console compatibility. It was later fixed via a firmware update, and other LG WOLED variants didn’t even have the issue. There is no way the issue will arise on a native 4K LG monitor.
 
Indeed. The amount of total BS people try to spread around here is comical.
Honestly, I think this may be an honest misunderstanding. I had the exact same discussion last week in another thread with another user. He was also adamant that 4:2:2 chroma is a product of DSC at 4K 10-bit. Turns out, he was thinking back to the first generation of high-refresh 4K monitors, like the Asus PG27UQ, that did not use DSC and were, indeed, limited to subsampling in order to get 120Hz. Without it, they topped out at 98Hz, which is, as per the table above, exactly the maximum for uncompressed HBR3 output. People maybe forget, but the first DSC-using monitors didn’t arrive until 2020. So I think @bug here may just be a bit confused, as that other user was.
 
No DP 2.1, then no buy from me. Do not give in; the minimum manufacturers should do is DP 2.1. You, the people, have the power to say no to the BS. Buy the ones with DP 2.1 only. Next time they will think twice before pulling the same BS.

Yeah, whenever I do go OLED it will be WOLED.
 
LG probably makes great monitors; my LG fridge died the other day.
I couldn't get my LG fridge down to low temps for weeks, until my 3-year-old found a thin piece of plastic on the inside, somewhere I hadn't seen, and peeled it off.

Getting back on topic: my complaint with my LG monitor (27UL600-W) is that the HDMI won't do 60Hz reliably since the new Navi GPUs came out, while the DP is fine. I got mine pretty cheap, a $329 Best Buy special at the time.

It's hard to believe that in 2024, on a US$1,399.99 monitor, they can't put two or three DPs; they are still doing 1 DP, 2 HDMI.
 
Can it do 1440p 360Hz?

So the thing about DP and LG is that on this screen, HDMI is faster. I have a really good feeling that using HDMI with any recent high-end LG will result in the display switching to 4K 120Hz when starting to play games. It's what happens with LGs when using HDMI. You have to use a third-party tool to delete TV resolutions from the display's HDMI metadata, or else you will be constantly forced into 4K 120Hz whenever you start a game.

There's no good reason NOT to include DP 2.1 on a brand-new $1K+ PC monitor in 2024, but also include the royalty-demanding HDMI interface, unless it's really meant for consoles.

Not at all; I use HDMI due to using an AV receiver.
 
Because you do photo work and you want the wide gamut and the dynamic range?
Because you want a gaming monitor that will do fast refresh without the drawbacks of overdrive?
Because you want to experience HDR the way God intended?
(All of the above assume that you also have $1.5k+ to part with.)
Do you understand that, for all of that, you still don't need 240Hz?
 
If you want to believe in it, count me out; DSC is lossy, just like MP3.
Get used to it, because past 4K 240Hz the bandwidth required for an uncompressed stream exceeds what the DP 2.1 spec provides.
 
Uh, not according to VESA's own information. At DP 1.4a (so HBR3) you don’t get to subsampling until 5K 240Hz, if we are talking 10-bit color. You can look up the table for that yourself here, under “Resolution and refresh rate limits”.
This is where you need to look: https://en.m.wikipedia.org/wiki/DisplayPort#Refresh_frequency_limits_for_HDR_video
HDR and 10bpc @4K 120Hz is just outside the realm of HBR3.
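The arithmetic behind that, as a rough sketch (active pixels only; blanking overhead only widens the gap):

```python
# Can HBR3 (DP 1.4a) carry 4K 120 Hz 10-bit RGB uncompressed?
# Active-pixel payload only; real CVT-R2 timings need even more.

active_gbps = 3840 * 2160 * 120 * 30 / 1e9  # 10 bpc, 4:4:4

HBR3_RAW = 32.4                 # 4 lanes x 8.1 Gbit/s
HBR3_PAYLOAD = HBR3_RAW * 0.8   # 8b/10b encoding leaves 80% for data

print(f"needed ~{active_gbps:.2f} Gbit/s, HBR3 payload {HBR3_PAYLOAD:.2f} Gbit/s")
# ~29.86 Gbit/s already exceeds 25.92, so DSC or subsampling is required
```

Note this only shows that the *uncompressed* stream doesn’t fit HBR3; it says nothing about whether DSC then needs 4:2:2, which is the point being argued above.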

Do you understand that, for all of that, you still don't need 240Hz?
I do. But that's sort of built in when talking OLED, so what can you do? Ask for 60Hz models?
 
That was an issue on the 1440p LG OLED monitors, and it was there to allow for better 4K console compatibility. It was later fixed via a firmware update, and other LG WOLED variants didn’t even have the issue. There is no way the issue will arise on a native 4K LG monitor.
I had the issue with the IPS version, the 27GR83Q-B. I bought it directly from LG in December 2023. The last time I checked for an update was mid-January, and it wasn't fixed, at least for the IPS model.
 