[argyllcms] Re: NEC monitor controls

  • From: Graeme Gill <graeme@xxxxxxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Tue, 29 Jul 2008 01:47:51 +1000

Adrian Mariano wrote:
> I got the impression from the Argyll documentation that the LCD
> displays are doing something similar to a calibration if you change
> the temperature or the RGB settings.

It depends very much on the technology of the display. I think the
advice is largely true for current typical LCD displays, but there
may be exceptions, and really high end displays, as well as new
technology displays if and when they come along, may not be
implemented this way. One way to verify what's going on is to use
the "dispcal -v -R" option (report on the uncalibrated display) to
see whether the effective bit depth is compromised as you move the
monitor controls away from their defaults. Note though that this
test may not be sensitive enough to detect anything less than a
2:1 loss of precision.
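
For instance (the exact wording of the report varies between Argyll
versions):

   dispcal -v -R

run once with the controls at their factory defaults and once after
adjusting them, then compare the reported figures to see whether
precision is being lost.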

> The advice seemed to be to set the display into the most native
> possible state and then do all your own calibration once, which
> would get the most out of the available bits. Is this not true? I
> set my display to native whitepoint and turned the brightness
> setting down to get 170 cd/m^2, and then calibrated it to sRGB
> (6500K) rather than using the sRGB setting on the display. Would I
> be better off switching the display to sRGB? (How would I tell
> which approach is better?)

Only by trying it and evaluating the two options. You really want a
nice smooth grey wedge to look for quantization type effects.
Note that the Argyll timage utility is capable of generating some
basic TIFF test images. The default test chart (i.e. "timage cube.tif")
is one place to start, if you have a utility capable of displaying
the image without applying color management to it (i.e. with just
the installed calibration applied).
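
If you don't have a suitable wedge image handy, a rough sketch of
generating one yourself (not part of Argyll; it assumes Python with
numpy and Pillow installed) would be:

   import numpy as np
   from PIL import Image

   # 1024 x 256 neutral grey wedge, 8 bits per channel.
   width, height = 1024, 256
   ramp = np.linspace(0, 255, width).astype(np.uint8)  # 0..255, left to right
   wedge = np.tile(ramp, (height, 1))                  # repeat down the image
   rgb = np.stack([wedge] * 3, axis=-1)                # R = G = B, so neutral
   Image.fromarray(rgb, mode="RGB").save("wedge.tif")

View the result at native resolution without color management; any
visible banding or uneven steps in the ramp points to quantization
somewhere in the chain.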

> Another question concerns the proper choice for the "sharpness"
> setting. Someone in an earlier message said "use the factory
> default", based on a vague reference. I came across the following
> test pattern:
>
> http://www.lagom.nl/lcd-test/sharpness.php
>
> and when I tuned the sharpness based on this test pattern I ended
> up at 8.3%, which is significantly below the factory default of
> around 26%. (I have a NEC 20WMGX2.) Of course, I don't have a real
> understanding of what this setting is doing, or whether adjusting
> it in this direction is right, but the effect on this test pattern
> is very clear.

It might be that the factory default enhances the sharpness, since it
makes the display "look good", and that helps sell more displays.
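
For what it's worth, a pattern in the same spirit as that page
(though not the Lagom pattern itself) can be generated locally,
which avoids any scaling a browser might apply. A minimal sketch,
again assuming Python with numpy and Pillow:

   import numpy as np
   from PIL import Image

   # Alternating single-pixel black/white columns. Viewed at native
   # resolution this should look like a fairly even mid grey; strong
   # light/dark fringing suggests the sharpening is too aggressive.
   width, height = 512, 256
   cols = (np.arange(width) % 2) * 255
   pattern = np.tile(cols.astype(np.uint8), (height, 1))
   rgb = np.stack([pattern] * 3, axis=-1)
   Image.fromarray(rgb, mode="RGB").save("sharpness.tif")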

Graeme Gill.
