On Wed, Nov 23, 2022 at 00:51 (+0100), deloptes wrote:
> Jim wrote:
>> Heresy and rubbish! The higher the resolution, the better (to a
>> point, anyway).
>
> Really, I thought it is maths :)
I think you did not read my email, or maybe you didn't understand it.
> cause how many dots in which density I can display on 15" ... of
> course it is much smaller then the same displayed on 27".
(Or maybe I am misinterpreting what you write; I can't figure out with
any certainty what the above is supposed to mean.)
> To read the same text on the 15" I must put glasses or lower
> resolution. On the 13" I really need glasses :) In math is truth :)
The reason the text is so small is that the X server (probably) lies
to your font rendering library about the actual DPI, so the font
rendering library uses a smaller number of dots to render (say) a 12pt
font *than it should* at your actual DPI.
And since your dots are closer together than the font rendering
library has been led to believe, the letters appear too small for you
to read easily.
If the truth were told to the font rendering library, it would know
that it needs to use more dots than it is actually using.
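The arithmetic behind this is simple: a point is 1/72 of an inch, so a
12pt glyph needs `12 * DPI / 72` pixels of height. A small sketch (the
DPI figures below are illustrative, not measurements of any particular
panel):

```python
def glyph_pixels(points: float, dpi: float) -> float:
    """Pixels needed to render a font size at a given DPI.

    A point is 1/72 inch, so pixels = points * dpi / 72.
    """
    return points * dpi / 72

# A 12pt glyph on a screen whose real DPI is 96 (a common assumed default):
print(glyph_pixels(12, 96))   # 16.0 pixels

# The same 12pt glyph on a hypothetical 162 DPI laptop panel:
print(glyph_pixels(12, 162))  # 27.0 pixels
```

If the X server reports 96 DPI for that hypothetical 162 DPI panel,
the library draws 16 pixels where 27 are needed, and the glyph comes
out at roughly 60% of its intended physical size.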
You don't want lower resolution on your laptop screen.
You want the font rendering library to know what the actual DPI is, so
that it can draw each glyph with enough dots to make readable text.
And in case you think I'm off on a theoretical rant: I'm not. I
have this working on my systems (Slackware64 15.0 and various
Raspbian/Raspberry Pi OS versions).
For example, I have told my terminal emulator to use Inconsolata at 10
points. And the text in my terminal windows looks pretty much the same
actual size (as measured by a ruler) on my three wildly-varying DPI
screens. So I don't need to change my eyewear when going from one
screen to the other.
If you think it is "of course" that the text is smaller on a higher
DPI screen, you are missing the essential point that a 12pt font
is defined by a number of points, not a number of pixels.
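Put another way: a point is a physical unit, 1/72 of an inch, so a
12pt glyph should measure about 4.2 mm on a ruler on any screen; only
the pixel count changes with DPI. A sketch (again, the DPI values are
illustrative):

```python
POINTS_PER_INCH = 72
MM_PER_INCH = 25.4

def physical_mm(points: float) -> float:
    """Physical height of a font size; independent of the screen."""
    return points * MM_PER_INCH / POINTS_PER_INCH

def pixels_needed(points: float, dpi: float) -> float:
    """Pixels the renderer must use to hit that physical size."""
    return points * dpi / POINTS_PER_INCH

# Same physical size everywhere; different pixel counts per panel.
for dpi in (96, 162, 227):  # illustrative desktop / 15" / 13" panels
    print(dpi, round(physical_mm(12), 2), round(pixels_needed(12, dpi), 1))
```

The first column of output varies, the physical size does not; that is
exactly the ruler test described above.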
Cheers.
Jim