I have been trying and failing to get my three monitors, two of which are ancient and all of which are different, to produce matching colors. I'm using the NVIDIA non-free driver.
For purposes of this discussion let's ignore the hardware color controls (which are inadequate).
I can use the NVIDIA X Server Settings utility which has excellent controls for R, G, and B but intermittently tells me that something else has changed the color settings.
I can use Trinity Control Center / Peripherals / Display, but the Monitor Gamma function affects all three monitors simultaneously. Unlike other gamma controls, which seem to prefer a number near 2.2, this one seems to prefer a number near 1.
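A plausible explanation for the ~1 vs. ~2.2 discrepancy (assuming the Trinity slider works like X's gamma *correction*, as in `xgamma` or `xrandr --gamma`): a correction value of 1.0 means "leave pixel values alone," applied on top of the display's native ~2.2 response, whereas controls calibrated around 2.2 set the target display gamma directly. This small sketch models the per-channel lookup table such a correction builds; the output name in the comment is hypothetical.

```shell
# encoded = 255 * (value/255)^(1/gamma); gamma 1.0 is the identity,
# larger values brighten midtones.
gamma_map() {
  awk -v v="$1" -v g="$2" 'BEGIN { printf "%d\n", 255 * (v / 255) ^ (1 / g) + 0.5 }'
}

gamma_map 128 1.0   # identity: prints 128
gamma_map 128 2.2   # midtone brightened considerably
# Per-output correction (hypothetical output name) can be set with e.g.:
#   xrandr --output DVI-I-1 --gamma 1.1:1.0:0.95
```

Since `xrandr --gamma` targets one output at a time, it may be a way to adjust each monitor independently when the desktop-wide control won't.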
I can use Trinity Control Center / System Administration / Monitor & Display, which purports to control the three monitors separately, but the instant I move any gamma control by the smallest increment my right-hand monitor (no matter which monitor I have selected) looks like a 1980s CGA display. (This can be fixed by rebooting or by starting the NVIDIA X Server Settings utility.)
There is also Trinity Control Center / Peripherals / Color Profile which I have not used, and where everything is unchecked in hopes of not confusing myself even further.
I have been unable to find anything online by searching, although some CGA-like scrambling was mentioned in passing in this thread: https://www.spinics.net/lists/trinity-users/msg11883.html
Can anyone point me to information on how TDE gamma controls work, how they are supposed to be used, and/or how to stop whatever in the background is overriding NVIDIA X Server Settings?
Thanks!
--Mike
said Mike Bird via tde-users:
| I have been trying and failing to get my three monitors,
| two of which are ancient and all of which are different,
| to produce matching colors. I'm using the NVIDIA non-free
| driver.
Can't be done. Not even with one of those cool, expensive gadgets that read color off the screen and provide details. It *might* be done if your monitors were of the same model, made on the same day in the same batch, but even that is not certain. I've spent many hours trying to do this -- I've made pictures for a living for a long time. There are expensive trick calibrated monitors, but even they are of limited value unless you're doing pictures for print, because if you're doing pictures for online display, whatever you put up is going to be subject to the vagaries of the viewers' monitors.
I have two fairly expensive 27-inch monitors here, one above the other. (The combination of the Nvidia software and XR&R lets me have one apparent 1920x2160 screen.) One is irredeemably warmer and of less contrast than the other. Which is to some extent a blessing, because when I get a picture right in one I then look at it in the other to get a sense of the variety of screens that are likely to be used in viewing it.
There is a degree of hope. For my DIY smart television project I got a cheap -- $160 -- 32-inch ONN monitor at Walmart last week. First, it has as close to usable onboard controls as I've ever seen: a little joystick on the back. Second, and more important, it has DDC plus extended commands from the computer. This means I can adjust the monitor settings from the computer itself, rather than have the computer and the monitor competing. It *should* give me a recipe that can then be taken from monitor to monitor, though I haven't tested the result of that.
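On Linux the usual way to send those DDC/CI commands from the computer is `ddcutil`; the VCP codes below (0x10 brightness, 0x16/0x18/0x1A red/green/blue gain) come from the VESA MCCS standard, but the display number is an assumption and the whole block is a sketch, guarded so it is harmless on machines without ddcutil or a DDC/CI-capable monitor.

```shell
# Requires the i2c-dev kernel module and read access to /dev/i2c-*.
if command -v ddcutil >/dev/null 2>&1; then
  ddcutil detect                     # list DDC/CI-capable displays
  ddcutil --display 1 getvcp 10      # read brightness (MCCS VCP code 0x10)
  ddcutil --display 1 setvcp 10 55   # set brightness to 55
  # Per-channel gain, useful for matching color across monitors:
  ddcutil --display 1 setvcp 16 50   # red gain   (0x16)
  ddcutil --display 1 setvcp 18 50   # green gain (0x18)
  ddcutil --display 1 setvcp 1a 50   # blue gain  (0x1A)
else
  echo "ddcutil not installed"
fi
```

Since these settings live in the monitor itself rather than in an X lookup table, they should survive whatever desktop process keeps overriding the NVIDIA settings.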
Then there's the fact that monitors themselves, especially their illumination, degrade over time, and to make it worse they do so at different rates, not just monitor-to-monitor but specimen-to-identical-specimen. So identical settings on identical monitors are likely to drift apart, at different rates, over time.