Jim composed on 2026-02-04 10:42 (UTC-0400):
On Wed, Feb 4, 2026 at 05:52 (-0500), Felix Miata wrote:
Fonts tend to scale best from size to size when DPI is set to a multiple of 24, or at least, 12, so 192 or 168 or 216 or 204 may be better selections.
Felix,
do you have any authoritative reference for this that you would care to share? I've seen comments like this before, and every time I try to track down an authoritative basis for the claim, I find nothing but unsubstantiated repetition of it. (That is, if it gets repeated often enough it takes on a life of its own.)
I'd love to find out why this 24 (or 12 or whatever) is special.
Computer fonts, at least originally, last century, were tuned to 96 DPI, even though few computer displays offered anywhere near that density. Medium fonts were nominally 12pt, which at 96 DPI is 16px; website designers found that enormous, and on those low-DPI screens of 1024x768 or worse, it was. 24 and 12 are major harmonics of 96, which translate into a more linear size progression than lesser harmonics or fractional values do.
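The 12pt-equals-16px relationship above follows from the definition of a point as 1/72 inch; a minimal sketch of the arithmetic (the function name is mine, not from any library):

```python
def pt_to_px(pt: float, dpi: float) -> float:
    """Convert a point size to pixels: 1pt = 1/72 inch, so px = pt * dpi / 72."""
    return pt * dpi / 72

# The nominal 12pt "medium" font at the 96 DPI reference density:
print(pt_to_px(12, 96))   # 16.0 -- the 16px default that designers found enormous
print(pt_to_px(12, 120))  # 20.0 -- the same 12pt at 120 DPI
```

Note that only at DPI values divisible by 72's factors do whole point sizes land on whole pixel counts, which is part of why particular DPI choices render more cleanly than others.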
It's related to why 96 DPI became the frame of reference, or baseline: it came from Windows (Unix used to offer a choice between 75 and 100) and was inherited by Linux. It's also related to the difference between sizing fonts in points, which used to be the only measurement for font size, and sizing them on computers in pixels, the bastardization of computerdom responsible for scaling (and lack-of-scaling) issues in computer UIs. Were everything sized in ems instead of px, the computer would do the computing instead of the programmers, and scaling would be automatic: simply pick a baseline em size. Real points, as opposed to the logical points that are all any modern web browser still supports, depend on display density, so pixel-to-point conversion varies with density as well.

I did an enormous amount of font testing after creating my website (which no longer exists, due to spammers and ISP interference) over two decades ago. It's in the individual font definitions: from one size to the next, best results come from a linear progression. In the smallest sizes, 9-16px (6.75-12pt), there simply aren't enough pixels in a glyph box to get a linear progression from one size to the next. A "10px" glyph box only has about 50 dots to work with, barely enough for a recognizable lower-case character that barely uses half the box.

As density increases, requiring more px to keep the same physical size, increments get more linear, because pixel quantity grows with the square of the linear size. Raise DPI about 25% beyond 96, to 120, and a 12pt/16px font, typically an 8px x 16px box of 128 pixels, becomes a 12pt/20px font in a 10px x 20px box of 200 pixels. That's 56% more detail contributing to linearity for only a 25% size increase. So IOW, if you're running a genuine hiDPI display, the importance of multiples of 12 or 24 effectively disappears, and it becomes moot for users like the OP running upwards of 144 DPI.
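The glyph-box arithmetic above can be sketched as follows (the 2:1 height-to-width box shape is taken from the 8x16 example in the text; real fonts vary):

```python
def glyph_box_pixels(pt: float, dpi: float, aspect: float = 0.5) -> int:
    """Total pixels in a glyph box whose height is pt * dpi / 72
    and whose width is height * aspect (assumed 2:1 box, per the example)."""
    height = pt * dpi / 72
    width = height * aspect
    return int(height * width)

at_96 = glyph_box_pixels(12, 96)    # 8 x 16 box  = 128 pixels
at_120 = glyph_box_pixels(12, 120)  # 10 x 20 box = 200 pixels
print(at_96, at_120, f"{at_120 / at_96 - 1:.0%}")  # 128 200 56%
```

Because area scales with the square of the linear dimension, every linear DPI increase compounds: that quadratic growth in available dots is what washes out the multiple-of-12/24 effect on genuinely high-density displays.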
At 141.7 DPI, I wouldn't have the eyesight required to notice the effect, which was blatant on the 90-DPI-or-less 1024x768 and 1280x1024 CRT screens of yesteryear with 10pt/13px and neighboring fonts.