
Multiple sharp resolutions on an LCD screen?

Hello
LCD flat screens have only one native resolution, and all other resolutions look blurred.
In 2018:
Has anyone yet invented a modern LCD flat screen that offers all resolutions crystal sharp?
- Like 320, 480, 640, 1024 and such...

Why don't trillionaires invent such screens? I've waited 15 years... Why no progress? :banghead:
 
Nope.

Software has evolved to support multiple resolutions.
 
In all fairness, many LCDs support multiple resolutions but have a single native resolution. As long as you maintain the same aspect ratio, you can change down to a lower resolution if necessary and still maintain good screen fidelity.
 
It's not going to happen. Resolutions are based on pixel counts, and pixel counts are based on the actual number of transistors. If the monitor has 1920 columns and 1200 rows of pixels, then 1920 x 1200 is going to be the ideal (native) resolution. Period. The "grid" pattern on any given LCD is set in stone - or rather, a "physical" property of that particular screen. Any other resolution set via software is basically going to be a simulation/compromise. As long as the same aspect ratio is maintained, they can scale down to lower resolutions and still keep pretty sharp detail, but it will never be as sharp as the native resolution simply because it is not physically possible.
I've waited 15 years... Why no progress?
There has been tremendous progress. Densities (the number of transistors per square inch) have increased significantly. Remember, these pixels are square. That's fine for straight lines, but for curves, no matter how small the square, there will still be jagged edges. The smaller the squares, though, the smoother those curves "appear" to the human eye. And there have been significant advances in those areas.
 
It's impossible to achieve that because pixels are the physical limiting factor. What you can do is use specific ratios, preferably resolutions whose pixels map evenly onto the physical pixel grid. The best way to find them is to go through resolutions one by one and observe the details on screen. It comes down to how a pixel rendered at the lower resolution gets split between physical pixels on the screen. If it's split nicely, you'll get a reasonably sharp image; if it gets split between 3 pixels, it'll become a horrible blurry mess.
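
To put some numbers on that splitting, here's a rough Python sketch (the function is made up purely to illustrate the arithmetic; it's not how a scaler is actually written):

def source_pixel_span(src_index, src_res, native_res):
    """Return the span of physical pixels that one source pixel covers."""
    scale = native_res / src_res
    return src_index * scale, (src_index + 1) * scale

# 1920-wide signal on a 2560-wide panel: scale is 1.333..., so pixel edges land
# partway through physical pixels and have to be blended -> blur.
print(source_pixel_span(0, 1920, 2560))  # (0.0, 1.333...)
print(source_pixel_span(1, 1920, 2560))  # (1.333..., 2.666...)

# 1920-wide signal on a 3840-wide panel: scale is exactly 2, edges always line up -> sharp.
print(source_pixel_span(0, 1920, 3840))  # (0.0, 2.0)
print(source_pixel_span(1, 1920, 3840))  # (2.0, 4.0)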
 
Actually, other resolutions can in fact look just as "sharp" as long as they are a sub-multiple of the native resolution on each axis. For example, 1080p should look fine on a 4K monitor, or 720p on a 1440p monitor.
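
For what it's worth, the check behind that is trivial - a quick Python sketch of it (the function name is made up, just to show the arithmetic):

def is_integer_submultiple(signal, native):
    """True if the signal resolution fits the native one a whole number of times on both axes."""
    (sw, sh), (nw, nh) = signal, native
    return nw % sw == 0 and nh % sh == 0 and nw // sw == nh // sh

print(is_integer_submultiple((1920, 1080), (3840, 2160)))  # True  - exactly 2x per axis
print(is_integer_submultiple((1280, 720), (2560, 1440)))   # True  - exactly 2x per axis
print(is_integer_submultiple((1920, 1080), (2560, 1440)))  # False - 1.33x, pixels must be blended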
 
It's impossible to achieve that because pixels are the physical limiting factor.
Beat you by 14 minutes! LOL
Actually, other resolutions can in fact look just as "sharp" as long as they are a sub-multiple of the native resolution on each axis.
Not really. As I said above,
As long as the same aspect ratio is maintained, they can scale down to lower resolutions and still keep pretty sharp detail, but it will never be as sharp as the native resolution simply because it is not physically possible.
Now for sure, someone like me (someone with 66-year-old eyes) may not be able to see the difference, but an 18-year-old kid with perfect eyesight probably could.
 
Not really.

Well... not really. A 1080p signal on a 4K monitor, for example, will look just as good as it would on a native 1080p panel because no interpolation takes place at all. Every pixel is scaled by an integer factor on each axis.

1080p on 4K, or 720p on 1440p or 4K, are examples of resolutions that will look just as crisp as on a panel of that respective native resolution. There are TVs out there where 1080p doesn't look fine at all, but that's due to shit scaling software that still tries to interpolate the signal, not because of any physical limitation. Even then, you can bypass all of this by having the game or the driver perform the scaling internally.

If you don't believe me, use Nvidia's DSR with the blurring disabled and notice how the image only looks pixel-perfect when the factor is set to 4.00x - it's the same concept at work, but in reverse.


[Attached image: 1522855222956.png]
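
For anyone curious, integer scaling really is nothing more than pixel duplication - no new colour values are invented. A tiny sketch using NumPy (purely an illustration of the concept, not how a GPU or monitor scaler is implemented):

import numpy as np

# A 2x2 "image" with hard black/white edges.
img = np.array([[0, 255],
                [255, 0]])

# 2x integer upscale: every source pixel becomes a 2x2 block of itself.
upscaled = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)
print(upscaled)
# [[  0   0 255 255]
#  [  0   0 255 255]
#  [255 255   0   0]
#  [255 255   0   0]]
# Only the original values appear; nothing is blended, so the edges stay razor sharp.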
 
@Bill_Bright @Vya Domus The irony is that you're both right. :)

Sending a sub-multiple signal, e.g. 960x540 to a 1920x1080 monitor, will fit perfectly and should result in a perfectly crisp, if blocky, picture. However, the scaler on almost all monitors insists on applying anti-aliasing, which messes up the picture and makes it look horribly soft.

Same thing with a 2K (1080p) signal sent to a 4K monitor: as far as I can see, they're gonna anti-alias it (I've seen it on one monitor), meaning that a 2K signal looks worse on a 4K monitor than on a 2K one when there's no reason for it. This would really piss me off if I had a 4K TV and watched regular HD programming on it. I think the TVs commit the same cardinal sin on this one, lol.

The bit that really gets me is that there's no way to ever turn off the anti-aliasing in the monitor's or TV's menu.

The only way round it that I've found is to use the NVIDIA control panel to frame the 960x540 picture in a 1080p signal (scale to fit) and then send that, without any anti-aliasing.

@Vya Domus That's a really great set of illustrations. :)
 
The bit that really gets me is that there's no way to ever turn off the anti-aliasing in the monitor's or TV's menu.

Yes, most of them still upscale everything, but there are 4K TVs out there with scalers smart enough not to do that when it isn't needed. That's where the misconception lies within this discussion: there is no physical limiting factor to do with pixel formats and aspect ratios; it's an unfortunate software-related problem.

The only way round it that I've found is to use the NVIDIA control panel to frame the 960x540 picture in a 1080p signal (scale to fit) and then send that, without any anti-aliasing.

Or just do this and not have to deal with anything else.
 
Sending a sub-multiple signal, e.g. 960x540 to a 1920x1080 monitor, will fit perfectly and should result in a perfectly crisp, if blocky, picture.
Huh?

I think we are talking about different things here. "Perfectly crisp if blocky"??? A perfect circle has no blocks, no matter the resolution. But on a higher-resolution display those blocks will be smaller, deceiving our eyes into thinking the circle is perfectly round, with a perfectly smooth (not blocky) circumference.
 
The idea is to simply get rid of the interpolation that gets applied automatically by the TV.
 
Huh?

I think we are talking about different things here. "Perfectly crisp if blocky"??? A perfect circle has no blocks, no matter the resolution. But on a higher-resolution display those blocks will be smaller, deceiving our eyes into thinking the circle is perfectly round, with a perfectly smooth (not blocky) circumference.
I've read my original post and I don't understand your confusion. :confused:

The idea is to simply get rid of the interpolation that gets applied automatically by the TV.
It's what Vya said ^^

Expanding on it, anti-aliasing (AA) here is the pixel interpolation that makes the picture look soft in an attempt to reduce the visibility of the jaggies. Without it, a digital picture simply looks blockier and blockier as the resolution drops, creating ever bigger pixels which can be clearly seen. However, the edges of those pixels will always look perfectly sharp, and that's the key. Obviously at something really low like 320x200 the picture would look terrible, anti-aliased or not, but it would still look perfectly sharp (i.e. clearly defined, very blocky pixels) without AA.

However, when displaying a 2K picture on a 4K monitor, the AA actually degrades the picture. And it's so unnecessary, because the 2K picture divides perfectly into the 4K one, hence leaving no rounding errors that would produce uneven-width pixels which might look better anti-aliased.

Another reason why it looks so bad is that the crude AA applied by the scaler isn't anywhere near as good as what graphics cards can do.
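
To see that softening in actual numbers, here's a throwaway Python sketch (a one-pixel black/white edge doubled in size two ways; it has nothing to do with any real scaler, it just contrasts the two approaches):

# One row of pixels with a hard black/white edge.
row = [0, 0, 255, 255]

# Nearest-neighbour 2x: duplicate each pixel, the edge stays a hard 0 -> 255 step.
nearest = [p for p in row for _ in range(2)]
print(nearest)  # [0, 0, 0, 0, 255, 255, 255, 255]

# Naive linear interpolation 2x: new samples are blended from their neighbours,
# which invents an in-between grey and softens the edge.
linear = []
for i in range(len(row) * 2):
    pos = i / 2                          # output sample position in source coordinates
    left = min(int(pos), len(row) - 1)
    right = min(left + 1, len(row) - 1)
    frac = pos - left
    linear.append(round(row[left] * (1 - frac) + row[right] * frac))
print(linear)  # [0, 0, 0, 128, 255, 255, 255, 255] - note the grey 128 at the edge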
 
I've read my original post and I don't understand your confusion.
My confusion is that you seem to think a jagged or "blocky" (your word!) display of a curved image at a lower resolution is somehow "perfect" and of the same quality as the same curve displayed at a higher resolution.

It ain't happening, regardless of how sophisticated and advanced the interpolation and scaling may be. If my tired 66-year-old eyes can see that, yours should too.
 
My confusion is that you seem to think a jagged or "blocky" (your word!) display of a curved image at a lower resolution is somehow "perfect" and of the same quality as the same curve displayed at a higher resolution.

It ain't happening, regardless of how sophisticated and advanced the interpolation and scaling may be. If my tired 66-year-old eyes can see that, yours should too.
No, that's not what I'm saying at all, you're definitely confuddled. :)

Look, all I'm saying boils down to having the scaler turn off AA when the input signal is a sub-multiple of the native resolution of the monitor, e.g. a 960x540 picture on a 1920x1080 display, or a 1920x1080 picture on a 3840x2160 display. At least if it were user-configurable in the monitor, that would be acceptable, but no, they insist we have AA regardless, which spoils the picture.
 
No, that's not what I'm saying at all, you're definitely confuddled
At you, that's true. Because all I have been saying all along is that when you scale down an image, the quality is degraded. It is you and Vya who are arguing that such scaling and interpolation results in "perfect" rendering of the image. And that is simply not true. So I fail to see why you keep arguing about how great scaling and interpolation is when you yourself admit the image is blocky.

It seems you just want to argue while at the same time admitting the scaled down image is blocky. That just makes no sense.

I admit monitor makers are good at what they do. But ask any professional photographer or CAD/CAE/CAG designer/programmer if blocky curves are acceptable and unnoticeable and they are going to say, "No!"

Are they fine for most gamers? Probably. I mean where would Minecraft be if not fine?

But the OP asked why can't LCD displays use non-native resolutions and still produce "all resolutions crystal sharp"? The answer is because it is not "physically" possible - regardless what you and Vya believe and want everyone else to believe.
 
So to play Warcraft 1 and Carmageddon 1 at 640x480 and 800x640, do I need that old, big, 10-20 kg mammoth CRT? (Images on CRT screens look like they have bubble / barrel distortion. :( )
- Or is any modern flat screen able to make clear, sharp pixels with those 1990s resolutions as its native resolution? Screen size at least 15".

In 2004 I bought a ViewSonic VG500 15", one of the first LCDs in Finland, for 500 euros, and it has a 1024x768 native resolution.
 
Panels have a native resolution that is determined by their pixel configuration and density; it's a physical characteristic.

Higher resolution = smaller, more tightly packed pixels. If you lower the resolution, you suddenly have less detail because you are operating below the panel's native resolution.

If I have an array of 8x8 pixels and an image that's 4x4, I need to stretch that image to fill the 8x8 unless I use black borders - same concept with LCD resolution. I can't change the size or number of pixels in the panel, so I need to stretch or shrink the image to fit; if you drop to 1080p you will have a bunch of pixels that need to be scaled to fill the screen.

Upscaling is easy because all you need to do is scale the content to the panel, not the panel to the content, which is impossible because the panel can't physically change.
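
To make the 4x4 -> 8x8 example concrete, here's a trivial Python sketch (just an illustration of the idea, not how a panel's scaler is actually built): each source pixel simply becomes a 2x2 block.

# A 4x4 "image": 0 = black, 1 = white.
image = [
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
]

# Stretch to 8x8 by doubling every pixel horizontally and every row vertically.
stretched = []
for row in image:
    wide_row = [pixel for pixel in row for _ in range(2)]
    stretched.append(wide_row)
    stretched.append(list(wide_row))

for row in stretched:
    print(row)
# The 8x8 result contains only the original values, just in bigger blocks - the panel's
# pixel count never changes, only how the content is mapped onto it.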
 
At you, that's true. Because all I have been saying all along is that when you scale down an image, the quality is degraded. It is you and Vya who are arguing that such scaling and interpolation results in "perfect" rendering of the image. And that is simply not true. So I fail to see why you keep arguing about how great scaling and interpolation is when you yourself admit the image is blocky.

It seems you just want to argue while at the same time admitting the scaled down image is blocky. That just makes no sense.

I admit monitor makers are good at what they do. But ask any professional photographer or CAD/CAE/CAG designer/programmer if blocky curves are acceptable and unnoticeable and they are going to say, "No!"

Are they fine for most gamers? Probably. I mean where would Minecraft be if not fine?

But the OP asked why can't LCD displays use non-native resolutions and still produce "all resolutions crystal sharp"? The answer is because it is not "physically" possible - regardless what you and Vya believe and want everyone else to believe.
Ok, you're still arguing at cross-purposes. My explanation really couldn't be clearer, so I think we'll just leave it there.
 
Hello
LCD flat screens have only one native resolution, and all other resolutions look blurred.
In 2018:
Has anyone yet invented a modern LCD flat screen that offers all resolutions crystal sharp?
- Like 320, 480, 640, 1024 and such...

Why don't trillionaires invent such screens? I've waited 15 years... Why no progress? :banghead:
If you understood how an LCD works, you wouldn't be asking those questions. You'd also know there are several resolutions that can look sharp on current LCDs, but you probably don't want to use them - save maybe for FHD on a 4K screen.

So instead of banging your head against walls, maybe try a little bit of reading instead?
 
So to play Warcraft 1 and Carmageddon 1 at 640x480 and 800x640, do I need that old, big, 10-20 kg mammoth CRT? (Images on CRT screens look like they have bubble / barrel distortion. :( )
- Or is any modern flat screen able to make clear, sharp pixels with those 1990s resolutions as its native resolution? Screen size at least 15".

In 2004 I bought a ViewSonic VG500 15", one of the first LCDs in Finland, for 500 euros, and it has a 1024x768 native resolution.

Or you can just run the application in a window at its native resolution. It is going to be small, but then you have what you want. Heck, you could even run Carmageddon and Warcraft 1 side by side on a single screen :P
 
Flat-screen CRTs in the future?
Will CRTs ever come back as a flat-screen technology?
So they can show all kinds of resolutions sharp and full screen?
 
Flat-screen CRTs in the future?
Will CRTs ever come back as a flat-screen technology?
So they can show all kinds of resolutions sharp and full screen?
No, it's a dead technology that's been superseded by better ones.
 
The problem will be solved when the pixel density is so high that the best eye in the world can't see it, but for now a 4K/5K 24" (or smaller) screen is the best you can get.

FHD on my UHD 28" looks as good as it does on my wife's 27" FHD screen, so no AA crap on that one.
 