On Wed, Feb 4, 2026 at 10:47 (-0500), Felix Miata via tde-users wrote:
Jim composed on 2026-02-04 10:42 (UTC-0400):
On Wed, Feb 4, 2026 at 05:52 (-0500), Felix Miata wrote:
Fonts tend to scale best from size to size when DPI is set to a multiple of 24, or at least of 12, so 192, 168, 216, or 204 may be better selections.
Felix,
do you have any authoritative reference for this that you would care to share? I've seen comments like this before, and every time I try to track down an authoritative basis for it, I never find anything except unsubstantiated repetition of the same claim. (That is, if it gets repeated often enough it takes on a life of its own.)
I'd love to find out why this 24 (or 12 or whatever) is special.
Computer fonts, at least originally last century, were tuned to 96 DPI, even though few computer displays offered anywhere near that display density. Medium fonts were nominally 12pt, which at 96 DPI is 16px; website designers found that enormous, and so it was on those low-DPI screens of 1024x768 or worse.
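For reference, the arithmetic behind that 12pt/16px figure is just px = pt * DPI / 72, one point being 1/72 inch. A minimal sketch in Python (the helper name is mine, purely illustrative):

    def pt_to_px(points, dpi):
        """Convert a point size to pixels, taking 1pt = 1/72 inch."""
        return points * dpi / 72.0

    print(pt_to_px(12, 96))  # 16.0 px: the nominal "medium" 12pt font at 96 DPI
    print(pt_to_px(12, 75))  # 12.5 px: the same 12pt on an old 75 DPI X display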
? The DPI of a 1024x768 screen can be quite high, if the screen is physically small.
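To put numbers on that, a quick sketch (my own helper, assuming square pixels) that derives DPI from resolution and diagonal size:

    import math

    def dpi_from_diagonal(width_px, height_px, diagonal_inches):
        """Approximate DPI from pixel resolution and diagonal screen size."""
        diagonal_px = math.hypot(width_px, height_px)
        return diagonal_px / diagonal_inches

    print(round(dpi_from_diagonal(1024, 768, 17.0)))  # ~75 DPI on a 17" desktop monitor
    print(round(dpi_from_diagonal(1024, 768, 8.0)))   # ~160 DPI on a small 8" panel

So the same 1024x768 mode spans anything from "low DPI" to well past 96, depending on the glass it lands on.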
24 and 12 are major harmonics of 96, which translate into a more linear size progression than lesser harmonics or fractional values do.
I don't follow you there. Would you care to elaborate a bit more?
It's related to why 96 became the DPI frame of reference, or baseline (coming from Windows, back when Unix used to provide a choice between 75 and 100, and inherited by Linux), as well as to the difference between sizing fonts in points, which used to be the only measurement for font size,
Really? I recall choosing between things like 7x13 and 9x15 fonts a long time ago. (Measured in pixels, not points. For better or worse.)
and sizing them on computers in pixels, the bastardization of computerdom responsible for the scaling issues, and the lack of scaling, in computer UIs.
Yup, the decision to size things in pixels was (for almost all things) a horrible decision which continues to cause pain so many years later.
Were everything sized in ems instead of px, the computer would do the computing instead of the programmers, and scaling would be automatic based upon simply picking a baseline em size.
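As a toy illustration of that point (element names and ratios invented, nothing TDE-specific): express every UI size as a multiple of one baseline em, and rescaling the whole lot is a single change.

    # Sizes expressed as multiples of one baseline; change the baseline and
    # everything rescales proportionally, with no per-element arithmetic.
    ui_sizes_em = {"body": 1.0, "caption": 0.75, "heading": 1.5, "toolbar_icon": 1.25}

    def resolve(base_px):
        return {name: round(em * base_px) for name, em in ui_sizes_em.items()}

    print(resolve(16))  # {'body': 16, 'caption': 12, 'heading': 24, 'toolbar_icon': 20}
    print(resolve(20))  # {'body': 20, 'caption': 15, 'heading': 30, 'toolbar_icon': 25}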
And this is one of my points in my previous message... I have been specifying my fonts for various programs in terms of points for a long time now (as opposed to pixels), and that makes sense when the system knows what the actual DPI of the screen is. As opposed to idiotically setting the DPI to 96 and then forcing every program to somehow do an end-run around this lie.
Real points, as opposed to logical points, which no modern web browser supports any more, depend on display density. Pixel-to-point conversion thus varies by density as well. I did an enormous amount of font testing after creating my website, which no longer exists due to spammers and ISP interference, over two decades ago. It's in the individual font definitions. From one size to the next, best results come from a linear progression. In the smallest sizes, 9-16px, or 6.75-12pt, there simply aren't enough pixels in a glyph box to get a linear progression from one size to the next. A "10px" glyph box only has about 50 dots to work with, barely enough for a recognizable lower-case character, which uses only about half the box.
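A back-of-the-envelope check of those dot counts, assuming (as the figures quoted here and below imply) a glyph box roughly half as wide as it is tall:

    # Approximate dots available in a glyph box of a given pixel height,
    # taking the box width to be about half the height (an assumption that
    # matches the 10px/~50-dot and 8x16/128-dot figures in this thread).
    for size_px in range(9, 17):
        dots = (size_px // 2) * size_px
        print(f"{size_px:>2}px box: ~{dots} dots")

At 9-10px there are only 36-50 dots in which to place an entire lower-case glyph, so one size step cannot change the outline smoothly.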
A few years ago, when I was looking for a new laptop, lots of laptops came with 1024x768 screens. Given that in 2001 (+/-) I bought a 15" laptop with a 1400x1050 screen, it seemed to me that the manufacture and sale of 1024x768 screens was a crime against humanity. If someone had marched all the workers out of the factory that made those screens and then blown the factory up, they could have been up for a humanitarian award.
As density increases, requiring more px to keep the same physical size, increments get more linear, because pixel quantity is a function of squares. By the time DPI increases by about 25% beyond 96, to 120, a 12pt/16px font, typically an 8px X 16px box, or 128 px, becomes a 12pt/20px font, a 10px X 20px box, or 200 px. That's 56% more detail going into linearity from just a 25% step up. So IOW, if you're running a genuine hiDPI display, the importance of multiples of 12 or 24 effectively disappears, and becomes moot for users like the OP running upwards of 144 DPI. At 141.7 DPI I wouldn't have the eyesight required to notice the effect, which on the 90 DPI or less 1024x768 or 1280x1024 CRT screens of yesteryear, at 10pt/13px fonts and their neighbors, was blatant.
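The 128-to-200 step checks out; spelled out, with the box shape assumed to stay at roughly half as wide as tall, as in the 8x16 example:

    # Dots in a 12pt glyph box at two densities: px height = pt * dpi / 72,
    # width assumed to be half the height (per the 8x16 example above).
    def glyph_box_dots(pt, dpi):
        height_px = round(pt * dpi / 72)   # 16px at 96 DPI, 20px at 120 DPI
        return (height_px // 2) * height_px

    at_96, at_120 = glyph_box_dots(12, 96), glyph_box_dots(12, 120)
    print(at_96, at_120, f"{at_120 / at_96 - 1:.0%}")  # 128 200 56%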
This is all fine. But none of it really explains (to me, anyway) the magic of why the DPI wants to be a multiple of 24 (unless maybe in the case of bitmap fonts). Vector fonts are going to have to be mapped to a rectangular grid of some number of pixels, and I really don't see how this mapping of the glyph curves to the grid is helped when the number of pixels per inch is some multiple of 24. How is an inch special, as opposed to (say) the number of pixels per centimeter being a multiple of 24?
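For concreteness, the sort of check I'd want to see is something like the following, which just counts, for each candidate DPI, how many common point sizes happen to land on a whole number of pixels. This is only my own speculation about what "multiple of 24" might be meant to buy, not anything established:

    # Speculative probe: at each DPI, count common point sizes that map to a
    # whole number of pixels via px = pt * dpi / 72. This tests only one
    # possible reading of the "multiple of 24" claim.
    common_pts = [6, 7.5, 8, 9, 10, 10.5, 11, 12, 14, 16, 18, 24]

    for dpi in (96, 120, 141.7, 144, 168, 192, 204, 216):
        exact = [pt for pt in common_pts if pt * dpi / 72 == int(pt * dpi / 72)]
        print(f"{dpi:>6} DPI: {len(exact):>2} of {len(common_pts)} common sizes give whole pixels")

And of course that probe is tied to the inch only by the definition of the point (1/72 inch); measure in pixels per centimetre and the arithmetic has no reason to prefer 24 at all.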
Jim