
Debunking the video screen dpi myth

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.99/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Whilst I've known for years that dpi (dots per inch) doesn't apply to pictures displayed on a video screen, but only to printers and scanners, which work with physical media, many people are still confused by it, especially as Windows has a "dpi" setting for changing the on-screen font size, which reinforces the misconception. This article does a really thorough job of explaining it all and debunking the stupid dpi myth.

We still frequently hear the very bad advice: "Computer video screens show images at 72 dpi, so scan all your images for the screen at 72 dpi". This is incredibly wrong; it simply doesn't work that way.

Regardless of what you may have heard almost everywhere, there is no purpose or reason for 72 dpi images on computer video screens or web pages. As a concept, 72 dpi is simply a false notion. It is a myth. It is NOT how video systems work, and it is detrimental to understanding how things really do work. It is very easy to show proof of this here.

If you want to know why, you're in the right place.

Read the rest at http://www.scantips.com/no72dpi.html
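
A quick way to see the point for yourself - just a minimal sketch using Python and Pillow, with made-up file names - is to save the same pixels with two different dpi tags. The screen shows both files identically, because it only ever sees the pixel dimensions; the dpi number is metadata that matters only when printing:

[CODE]
# Minimal sketch (Python + Pillow; file names are placeholders).
# The dpi value stored in an image file is just metadata: the pixels
# are identical either way, only the implied *print* size changes.
from PIL import Image

img = Image.new("RGB", (3000, 2000), "white")   # 3000 x 2000 pixels
img.save("photo_72dpi.jpg", dpi=(72, 72))
img.save("photo_300dpi.jpg", dpi=(300, 300))

for name in ("photo_72dpi.jpg", "photo_300dpi.jpg"):
    with Image.open(name) as im:
        dpi = im.info.get("dpi", (72, 72))
        print(name, "pixels:", im.size, "dpi tag:", dpi)
        # dpi only matters on paper: inches = pixels / dpi
        print("  implied print width:", im.size[0] / dpi[0], "inches")

# A screen simply maps the 3000 x 2000 pixels onto its own pixel grid;
# it never reads the dpi tag.
[/CODE]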


Mods: there doesn't seem to be an appropriate forum section for this subject, so I've put it here.
 

qubit

Overclocked quantum bit
Has anyone actually managed to get through the whole thing? I finally did and almost lost the will to live.

What he says is right, but damn, does he labour the point over and over again ad nauseam!
 
Joined
Sep 4, 2008
Messages
875 (0.15/day)
I think this fellow has just lost the plot a little.

I'm not saying he's WRONG, I'm just saying he's perhaps not entirely on track with the origin of this supposed myth.

To the best of my knowledge (and I myself may be wrong), this 72 dpi screen standard came about in the very early days of widely available home printing: a group of experts got together, did the maths and the averaging, and came up with an approximate average DPI that would make printed images come out of printers at roughly the size they appeared on the average displays of the time.

An average guesstimated standard to try and make pictures print more or less the size they looked on the technology of the day.

And that was it.

It had nothing to do with websites or video or anything else that wouldn't even formally exist in the public domain for at least another 10 years.

And it wasn't a RULE either - just a suggested guideline.

What some few fanatical groups may have done with that suggestion over time is entirely up to them.

Very much like the nonsense about the human eye only being able to see 30 fps or whatever, and that supposedly being why video standards in the USA are 30 FPS - that too is total bullhonkey. It also originated very long ago, when the very first transmission and display standards were being developed. When deciding how many frames per second to transmit, and how much bandwidth to dedicate to the signal, it was decided that the CRT displays of the time - based on the light response rate of their internal coatings - were not capable of showing distinctly different images faster than roughly 30 frames per second, so dedicating more bandwidth to an even faster frame rate would have been pointless because you would see no difference.

That is, you would see no difference because they were very early, primitive CRT displays that couldn't keep up - it had NOTHING to do with the abilities of the human eye.

Again people just took this advice and ran with it in any direction they felt they could.

EDIT: Although I'd say his opening tip is still very valid - don't scan at 72 DPI, it looks terrible on today's systems :) At the very least scan at 300 DPI (or more if you can) and scale/resample/resize it down to fit - mathematically downscaling an image can give it a richer look than scanning directly at whatever size you intend to use anyway... because of maths... and science... :) (I'm sure you already know this qubit - I'm just speaking generally)
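
Something like this is all it takes - a rough sketch with Pillow, where the file name, scan resolution and target width are just placeholders:

[CODE]
# Sketch of the "scan high, then downscale" tip (Python + Pillow;
# the file name and target width are placeholders). A good resampling
# filter averages many source pixels per output pixel, which is why
# the result tends to look cleaner than a low-resolution scan.
from PIL import Image

with Image.open("scan_600dpi.png") as scan:        # scanned at 300-600 dpi
    target_width = 1200                            # pixels you actually need
    scale = target_width / scan.width
    target_size = (target_width, round(scan.height * scale))
    small = scan.resize(target_size, Image.LANCZOS)
    small.save("scan_for_screen.png")
[/CODE]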
 
Last edited:

qubit

Overclocked quantum bit
According to the article, 72 dpi was first defined by Apple, because it roughly matched the pixel density of their monitors at the time and so made text on screen come out at approximately the same size as on paper (a typographic point is 1/72 inch, so at 72 pixels per inch one point maps to one pixel). I think he was thinking of the all-in-ones like the Mac SE with its 9 inch screen, although he never specified a model. It's still an arbitrary number though. It can be considered a sort of software scaling factor for rendering characters: if you know the physical size of the screen, you can make the characters roughly the same size as on paper regardless of the video resolution. The video subsystem still doesn't care about dpi though - it's still all pixels, so he's still right.
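
Roughly the arithmetic involved, as a back-of-envelope sketch (the display figures below are just examples, not anyone's real hardware):

[CODE]
# Back-of-envelope sketch of "dpi as a software scaling factor".
# The display figures are example values only.
import math

def actual_ppi(width_px, height_px, diagonal_inches):
    """Physical pixels per inch of a display."""
    return math.hypot(width_px, height_px) / diagonal_inches

def points_to_pixels(points, ppi):
    """A typographic point is 1/72 inch, so pixels = points * ppi / 72."""
    return points * ppi / 72.0

for name, w, h, diag in [
    ('~72 ppi display (640x480 @ 11.1")', 640, 480, 11.1),
    ('27" 4K monitor', 3840, 2160, 27),
]:
    ppi = actual_ppi(w, h, diag)
    px = points_to_pixels(12, ppi)   # a 12 pt character
    print(f"{name}: ~{ppi:.0f} ppi, 12 pt is roughly {px:.0f} px tall")

# On a ~72 ppi screen, 12 pt comes out at ~12 px, so on-screen text
# matches its printed size; on a ~163 ppi 4K panel it needs ~27 px.
[/CODE]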

And yeah, he lost the plot all right. You could see his frustration at people getting it wrong all the time coming through, and he really wanted to hammer the point home.

As far as broadcast video is concerned, let me start by defining a couple of terms. Since the picture is interlaced, you get 60 fields per second, making 30 frames per second (50 fields / 25 frames in the UK). This gives smooth 60 fps motion, but with resolution artifacts on anything that moves, due to the interlace. At the time it was considered an acceptable compromise, since it halved the bandwidth required.
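
If it helps to picture it, here's a minimal weave sketch (Python + numpy, with made-up field data) showing how two fields make one frame:

[CODE]
# Minimal interlacing sketch (numpy; the field contents are made up).
# Each field carries only half the scan lines; two fields weave into
# one full frame, so 60 fields/s of bandwidth yields 30 frames/s.
import numpy as np

HEIGHT, WIDTH = 480, 640

# Two fields captured 1/60 s apart, each holding every other scan line.
top_field = np.full((HEIGHT // 2, WIDTH), 1, dtype=np.uint8)     # lines 0, 2, 4, ...
bottom_field = np.full((HEIGHT // 2, WIDTH), 2, dtype=np.uint8)  # lines 1, 3, 5, ...

frame = np.empty((HEIGHT, WIDTH), dtype=np.uint8)
frame[0::2] = top_field      # weave: even lines from the top field
frame[1::2] = bottom_field   # odd lines from the bottom field

print("lines per field:", top_field.shape[0], "| lines per frame:", frame.shape[0])
# If anything moved between the two field captures, the woven frame
# shows the comb-like artifacts mentioned above.
[/CODE]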

Don't you wish they'd just done it properly back then and gone progressive? Today's digital hi-def transmissions are still interlaced because of this historic decision, and even though the TV or set-top box can be configured for progressive output, the conversion still costs some clarity. This is why I watch my TV at 1080i even though I don't like the interlace artifacts.

From what I read recently, 60 Hz was chosen because it matched the US mains frequency and served as a timebase for the TV to lock onto the incoming signal. This was because electronics were still in a primitive state back then (1920s/30s) and a completely independent sync circuit either could not be made or would have been prohibitively expensive for a consumer product.
 
Joined
Sep 4, 2008
Messages
875 (0.15/day)
Well, luckily I grew up on a far-away, archaic continent which, always being the last to receive any new technology, often ended up avoiding all of the early-adopter mistakes - so I have lived in the land of full progressive my whole life. (We just had to wait 10 years longer before we got colour TV :) and when the internet finally did arrive, it took a moment to figure out why all American video looked so very terrible.)

(No offense America, but interlacing is possibly the worst mistake in all of video history - although in all fairness it was never intended for displays sharp enough to reveal that it even existed, or for colour broadcast at all for that matter.)

The same was echoed in things like internet standards, cellular band ranges and so on. Being last often meant getting the latest revision.

But yeah, on the video and the DPI: it all goes back long before things like interlacing were even in use.

Apple used the 72 DPI standard, but the standard (or at least the concept of having a calculated standard) had been adopted years before by groups like IBM, in an effort to make data printouts relatable to the display (for all the reasons discussed above), going back to those giant database machines that looked like fitted kitchen units and were sold to big business.

And obviously the conceptual fps limit on CRTs was established long before the National Television System Committee even got round to ruining panning motion for everyone with their cursed interlace, or to any transmission signal format for that matter. The observation was that the phosphor coatings in the CRTs of the day took a certain minimum time to discharge (who knows what it was exactly - probably something around 35-45 Hz), meaning that if you cycled each field 30 times per second it would have plenty of time to discharge before the next cycle and persistence of vision could do the rest. Any form of signalling faster than that discharge cycle was essentially pointless, as the screen would simply begin to glow all over like a giant over-engineered light bulb.

(And yes, the frequency of the mains power supply was used a lot in the early days too - or at the very least, once they had thought of it, a lot of devices were specifically designed around it, as quartz crystal oscillators were still a way off.)

And even that was only relative to the technology of the day and intended only as a guideline at the time. Within a few years, advances in CRT coatings and ray calibration had made displays capable of far faster response, but the standards had been set by then - and the notion that "there is no point going any faster because you wouldn't be able to see it on these terrible old screens" slowly mutated into the ridiculous nonsense that "there is no point going any faster because your eyes cannot tell the difference".

In fact it is the human brain that forms images, not the eye, and it does so by compositing more than just one set of data. Whilst, for panning motion of similarly illuminated and textured content, there is an average cut-off point at which the human eye begins to fail to resolve subtle changes, that cut-off varies vastly from one individual to another and is strongly affected by the intensity of light in the scene. In a darkened environment the human eye can clearly detect flashes of light less than 1/5000 of a second in duration (or so I read in what was, if I'm not mistaken, a Berkeley article - I'm no eye specialist myself :) . Or to put it another way, under ideal conditions we are at least technically capable of distinguishing certain events faster than 5000 FPS :) although don't expect to read names engraved into bullets as they whizz past you...

Better yet just don't get shot at.

Especially by people who engrave names into bullets.
 
Last edited: