E. Liddell composed on 2020-01-28 19:28 (UTC-0500):
> Include the whole modeline in your xorg.conf, as in this example:
>
> Section "Monitor"
>     Identifier "Monitor0"
>     Modeline "1280x1024_60.00" 108.88 1280 1360 1496 1712 1024 1025 1028 1060 -HSync +Vsync
>     Option "PreferredMode" "1280x1024_60.00"
> EndSection
>
> Section "Screen"
>     Identifier "Screen0"
>     Device "Card0"
>     DefaultDepth 24
>     SubSection "Display"
>         Depth 24
>         Modes "1280x1024"
>     EndSubSection
> EndSection

Don't...
Xorg is every bit as capable of generating the modelines it needs as is gtf or cvt. All it needs is good information. Normally it gets it from EDID. When the EDID is bad, provide the info it needs to calculate from.
Normally the rates come from EDID, e.g. as shown by these Xorg.0.log excerpts:

    (II) modeset(0): Using EDID range info for horizontal sync
    (II) modeset(0): Using EDID range info for vertical refresh
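To check for the same thing on an affected machine, something like this should do (the log path is an assumption; some setups write to ~/.local/share/xorg/Xorg.0.log instead):

    grep -iE 'edid|range info' /var/log/Xorg.0.log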
Commonly the Screen section is purely superfluous when appropriate specs are provided in the Monitor section.
Those specs are in the display's manual or on the internet, or in the output of 'hwinfo --monitor'.
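A sketch of filtering that hwinfo output down to the relevant lines (field names vary somewhat between hwinfo versions and monitors, so treat the pattern as an assumption):

    hwinfo --monitor | grep -iE 'sync range|resolution'

Note that HorizSync is specified in kHz and VertRefresh in Hz.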
Never ever have I needed a modeline since Xorg was forked off of XFree86. Xorg is smart enough. It will automatically supply other supported modes when called for, instead of you needing to get cvt out and modify xorg.con* yet again.
Assuming xorg.conf is even the right way forward, which here I doubt, modelines and cvt are almost certainly unnecessary (and /ancient/) manual configuration components. OP's xorg.confs are loaded with unnecessaries. None of the input or font data or comments are relevant to the video mode employed. It's unlikely that BusID, an explicit driver specification, display subsections, or bit depths are needed either.
The following, as xorg.conf, should be sufficient, e.g. for a native 1920x1200, 9-year-old 24" NEC (adjust to actual specs before attempting use):
Section "Device" Identifier "DefaultDevice" EndSection
Section "Monitor" Identifier "DefaultMonitor" HorizSync 31-77 VertRefresh 56-61 Option "PreferredMode" "1920x1200" EndSection
Section "Screen" Identifier "DefaultScreen" Device "DefaultDevice" Monitor "DefaultMonitor" EndSection
In OP's situation, manual configuration is almost certainly a workaround for a problem better fixed than worked around. There's no good reason for needing intervention to have a display use its native mode. There probably isn't a GPU made in the past 15 years at least that can't handle 1920x1200 entirely through automagic, or with a tiny bit of help in the form of HorizSync and VertRefresh. Thus what's really needed is the specs involved (inxi -Gxx), and Xorg.0.log, and possibly the journal, to find the source of the problem.
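For reference, roughly the commands that would gather that info (the log path and the journal filter are assumptions; adjust to the actual system):

    inxi -Gxx
    grep -E '\(EE\)|\(WW\)|EDID' /var/log/Xorg.0.log
    journalctl -b 0 | grep -iE 'drm|edid'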