[argyllcms] Re: Strange contrast clipping during calibration

  • From: Graeme Gill <graeme@xxxxxxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Sat, 25 Aug 2012 11:47:29 +1000

Gerhard Fuernkranz wrote:
> Granted that the display behaves at least monotonically, mightn't it be better
> to search for the brightest WP with the desired chromaticity, subject to the
> constraint max(R,G,B) = 1, instead of determining the clipped WP from a coarse
> (possibly too inaccurate) model?

Hi Gerhard,

That's effectively what I'm doing. The "coarse" model was just device curves +
matrix == primaries. But if the curves don't reflect the actual behaviour, then
I suspect they were leading to a false gamut boundary. I've removed the curves,
so it should now depend only on the primary colors + the assumption of additive
mixing. I'm not sure why the curves would be that inaccurate (since they are
directly measured), and this "coarse model" is what gets used for the matrix
model out of dispcal, but without the particular display to play with, that's
my best guess at the source of this problem.
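
As a rough illustration (Python rather than the actual C, and with made-up
sRGB-like primaries standing in for the measured ones), the curve-less model
boils down to this: under pure additive mixing, the brightest white of a given
chromaticity is just the matrix solution scaled so that max(R,G,B) = 1.

import numpy as np

# Illustrative primaries (columns = XYZ of full R, G, B). The real values
# come from the display measurements; these are just sRGB-like stand-ins.
M = np.array([[0.4124, 0.3576, 0.1805],
              [0.2126, 0.7152, 0.0722],
              [0.0193, 0.1192, 0.9505]])

def brightest_white(M, x, y):
    # XYZ direction of the target chromaticity, arbitrary scale (Y = 1)
    xyz = np.array([x / y, 1.0, (1.0 - x - y) / y])
    rgb = np.linalg.solve(M, xyz)      # device values for that direction
    if np.any(rgb <= 0):
        raise ValueError("chromaticity outside the additive gamut")
    rgb /= rgb.max()                   # scale until max(R,G,B) == 1
    return rgb, M @ rgb                # device white and its XYZ

rgb, xyz = brightest_white(M, 0.3127, 0.3290)   # e.g. a D65 target

With curves in the model, rgb would additionally be pushed through the
per-channel curves, which is where an inaccurate curve could place the gamut
boundary in the wrong spot.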

> And I could imagine even more weird behavior of display-internal color
> corrections. Basically, in order to deal with such flaws, one would need to
> determine the largest monotonic RGB subset [0..Rmax, 0..Gmax, 0..Bmax] first,
> and let the calibration limit the RGB numbers sent to the display to this
> subset.

Right, so response behaviour (curves) would need to be used. The current
modelling assumes monotonicity, which could be the source of this particular
problem. Allowing for non-monotonicity makes things a whole lot more
complicated though.
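
Just to sketch what determining such a subset could look like (Python, with
hypothetical ramp measurements, not how the Argyll code is structured): take
each channel's measured ramp and keep only the prefix over which the response
is non-decreasing.

import numpy as np

def monotonic_limit(levels, measured_Y, tol=0.0):
    # Largest device value up to which the measured response is non-decreasing.
    for i in range(1, len(measured_Y)):
        if measured_Y[i] < measured_Y[i - 1] - tol:
            return levels[i - 1]
    return levels[-1]

# Hypothetical red-channel ramp: luminance rises, then dips near the top.
levels = np.linspace(0.0, 1.0, 11)
red_Y  = np.array([0.0, 0.01, 0.04, 0.09, 0.16, 0.25, 0.36, 0.49, 0.62, 0.60, 0.58])

r_max = monotonic_limit(levels, red_Y)   # 0.8 here; calibration would then
                                         # clamp red output to [0, r_max]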

Graeme.
