[argyllcms] Re: Strange contrast clipping during calibration

  • From: Gerhard Fuernkranz <nospam456@xxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Sat, 25 Aug 2012 13:36:59 +0200

Am 25.08.2012 03:47, schrieb Graeme Gill:

Gerhard Fuernkranz wrote:
granted that the display behaves at least monotonically, mightn't it be better to 
search for the brightest WP with the desired chromaticity [...]

That's effectively what I'm doing. The "coarse" model was just device curves + 
matrix == primaries. But if the curves don't reflect the actual behaviour, then I 
suspect they were leading to a false gamut boundary.

Hi Graeme,

if you "search" the RGB cube anyway (i.e. adjust RGB, measure, adjust RGB, 
measure, ... until an optimum is found), what role does the model play then? For 
approximating the derivatives (if the search/optimization is guided by gradients)?

Btw, one could possibly also consider a 2D search here, i.e.

   * Keep R fixed at 1.0 and search the [R=1,G,B] plane in device RGB space for 
a [1,G,B] triple closest to the desired chromaticity
   * Do the same with fixed G=1.0, and then with fixed B=1.0
   * Use the best of the three results

[ Dealing with non-monotonicity or flat response still remains a special issue, 
as the search may get stuck in a local minimum. ]
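A minimal sketch of this three-plane search, under the assumption of a purely 
additive toy model XYZ = M * RGB (a hypothetical sRGB-like primary matrix M 
stands in for actual instrument measurements, and a brute-force grid stands in 
for whatever optimizer one would really use):

```python
# Three-plane 2D search: pin one channel at 1.0, search the remaining two
# channels for the RGB closest to a target chromaticity, take the best of the
# three planes. Each xy() evaluation would be a real measurement in practice;
# here it is a hypothetical additive matrix model.
import numpy as np

# Hypothetical sRGB-like primary matrix (rows: X, Y, Z).
M = np.array([[0.4124, 0.3576, 0.1805],
              [0.2126, 0.7152, 0.0722],
              [0.0193, 0.1192, 0.9505]])

def xy(rgb):
    """Chromaticity (x, y) of the modelled display response."""
    X, Y, Z = M @ rgb
    s = X + Y + Z
    return np.array([X / s, Y / s])

def search_plane(fixed_idx, target_xy, steps=101):
    """Grid-search the plane where channel `fixed_idx` is pinned at 1.0."""
    best_err, best_rgb = np.inf, None
    free = [i for i in range(3) if i != fixed_idx]
    grid = np.linspace(0.0, 1.0, steps)
    for a in grid:
        for b in grid:
            rgb = np.ones(3)          # fixed channel stays at 1.0
            rgb[free] = (a, b)
            err = np.linalg.norm(xy(rgb) - target_xy)
            if err < best_err:
                best_err, best_rgb = err, rgb
    return best_err, best_rgb

target = np.array([0.3127, 0.3290])   # e.g. D65 as the desired WP chromaticity
err, rgb = min((search_plane(i, target) for i in range(3)),
               key=lambda t: t[0])    # best of the three planes
print("best RGB:", rgb, "chromaticity error:", err)
```

With this particular matrix the D65 target is met at the full-white corner, so 
the search returns RGB near [1, 1, 1]; a real display would of course land 
elsewhere on one of the three planes.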

I've removed the curves, so it should just depend on the primary colors + 
assumption of additive mixing.

If additivity is granted, then per-channel curves should basically not 
influence the CIELAB gamut hull. But is there a need to consider the gamut in 
CIELAB space at all? In order to fulfill in-gamut constraints, isn't it 
sufficient to restrict device RGB to the [0...1]^3 RGB cube (regardless of the 
corresponding CIE numbers)?
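A small numerical illustration of both points, again assuming an additive model 
with a hypothetical sRGB-like matrix M (the 2.2 gamma is just an example of a 
monotone per-channel curve): since such a curve maps [0,1] onto [0,1], it only 
re-parameterizes the cube, so every curved response stays inside the gamut of 
the curve-free matrix model, and the in-gamut test reduces to checking that 
M^-1 * XYZ lies in [0,1]^3.

```python
# Under additivity, XYZ = M @ f(RGB) with f a monotone per-channel curve onto
# [0,1]. The curve cannot enlarge the gamut hull, and "in gamut" is simply
# M^-1 @ XYZ landing inside the [0,1]^3 device cube.
import numpy as np

# Hypothetical sRGB-like primary matrix (rows: X, Y, Z).
M = np.array([[0.4124, 0.3576, 0.1805],
              [0.2126, 0.7152, 0.0722],
              [0.0193, 0.1192, 0.9505]])
Minv = np.linalg.inv(M)

def in_gamut(xyz, eps=1e-9):
    """In-gamut test purely in device space: back-project, check the cube."""
    rgb = Minv @ xyz
    return bool(np.all(rgb >= -eps) and np.all(rgb <= 1.0 + eps))

curve = lambda c: c ** 2.2            # example monotone curve, [0,1] onto [0,1]
rng = np.random.default_rng(0)
ok = all(in_gamut(M @ curve(rng.random(3))) for _ in range(1000))
print("all curved responses inside the matrix gamut:", ok)
```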

Best Regards,
