[argyllcms] Re: Display measuring at maximum brightness beneficial?

  • From: János, Tóth F. <janos666@xxxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Wed, 22 Jun 2011 16:44:46 +0200

I tried something like this some months ago.

I placed my ColorMunki on the display.
I set the desired brightness and white point with the hardware controls
(the RGB gains can also change the brightness...).
I removed the CM from the display.
I kept the display running for ~30 minutes until it warmed up to operating
temperature.
I attached my CM again and checked whether the white point settings needed
to be fine-tuned.
I increased the brightness until I hit ~300 cd/m^2.
I kept the sensor on the display for another ~30 minutes to let it warm up
while the display itself warmed up again.
I calibrated to the native white point and "the pure-power gamma which is
closest to the native TRC", and profiled it with a "single gamma + matrix"
style profile (roughly the command sequence sketched below).
I decreased the brightness back to the initial calibration settings.
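
For reference, the calibration/profiling step was roughly this command
sequence (only a sketch; the file name, patch count and gamma value are
placeholders, and the exact options depend on the instrument and display
type, so check them against the ArgyllCMS usage documentation):

  # report on the uncalibrated display to see its approximate native gamma
  dispcal -v -R
  # calibrate to the native white point (no -t/-w), targeting a pure power
  # gamma close to the reported native value (2.3 is just a placeholder)
  dispcal -v -qm -g2.3 mydisplay
  # generate patches, read them through the new calibration, and build a
  # "single gamma + matrix" profile
  targen -v -d3 -f64 mydisplay
  dispread -v -k mydisplay.cal mydisplay
  colprof -v -qm -aG mydisplay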

But there was a problem: it seems that the LCD panel behaved slightly
differently at the significantly higher temperature. Or maybe it was the
ColorMunki which didn't like the high temperature. Either way, the overall
result didn't seem better at all; maybe worse, maybe equal, but different.


Maybe it would be a better practice to characterize a colorimeter (which is
good at low luminance levels) with a spectro (which was only just attached
to the display, so it is still close to room temperature...) and then use
that colorimeter for the calibration and profiling in a non-contact setup,
so it can't warm up (or, the other way around, characterize an already warm
colorimeter against the cool spectro...).
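
ArgyllCMS seems to already have the pieces for this: read the same patch
set with the spectro and then with the colorimeter, feed both .ti3 files to
ccxxmake to get a CCMX correction matrix for the colorimeter, and then pass
that matrix to dispcal/dispread with -X. Roughly (file names are
placeholders, and I'm writing the ccxxmake options from memory, so check
them against its usage text):

  # read the same small chart with the spectro, rename corr.ti3 to
  # spectro.ti3, then repeat with the colorimeter and rename the result
  # to colorimeter.ti3
  targen -v -d3 -f32 corr
  dispread -v corr

  # build the correction matrix for the colorimeter
  ccxxmake -f spectro.ti3,colorimeter.ti3 mydisplay.ccmx

  # calibrate/profile with the corrected colorimeter
  dispcal -v -X mydisplay.ccmx ...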


What is the assumed low-light accuracy of the i1D2/i1LT hardware?
I am thinking about picking up the new ColorMunki Display with its claimed
~0.01 cd/m^2 low end, but that would be almost useless until ArgyllCMS
supports it. The i1LT is a bit cheaper and already supported. I only need
to go down to ~0.04 cd/m^2 with my current display.
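
(As a quick sanity check for whichever instrument ends up being used, a
full-screen black patch plus a few repeated emissive spot readings should
show whether it reports anything stable around that level; something like:

  # take repeated emissive readings of a black screen and watch the Y value
  spotread -v -e

but that's an aside, not part of the procedure above.)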
