Gerhard Fuernkranz wrote:
Well, Graeme, it's actually not the camera which produces XYZ values outside the visual gamut (including negative ones); ultimately it is always the matrix. If the Luther condition is met, then there exists (in the ideal case) indeed a single "correct" matrix which is entirely determined by the camera. But if the Luther condition is not met, then the choice of the matrix is a trade-off between several objectives and evils anyway, and it is no longer only the camera that determines the matrix. It would for instance certainly be possible to constrain the matrix so that no RGB triples inside the camera's spectrum locus map to XYZ numbers outside the visual gamut. Then we should never end up with imaginary XYZ values, even if the captured scene contains the most saturated colors one can imagine (i.e. monochromatic ones).
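To make the first point concrete, here is a small sketch of the ideal case. Everything in it is synthetic: the "CMFs" are toy Gaussian lobes standing in for the standard observer, and the camera is fabricated to satisfy the Luther condition exactly (its sensitivities are a fixed linear mix of the CMFs). Under that assumption, a single matrix recovered by least squares maps camera RGB to XYZ with essentially zero error, illustrating that the matrix is then entirely determined by the camera:

```python
import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.linspace(380, 730, 36)

# Toy "colour matching functions": three Gaussian lobes standing in for
# x-bar, y-bar, z-bar.  Not real CIE data -- just enough structure to
# demonstrate the algebra.
def lobe(center, width):
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

CMF = np.stack([lobe(600, 40), lobe(550, 45), lobe(450, 30)])  # 3 x N

# A Luther-compliant camera: its spectral sensitivities are an exact
# (invertible) linear combination of the observer's CMFs.
A_true = np.array([[0.9, 0.2, 0.0],
                   [0.1, 0.8, 0.1],
                   [0.0, 0.1, 1.0]])
S = A_true @ CMF                                               # 3 x N

# Fit a 3x3 matrix by least squares over responses to random test spectra.
spectra = rng.uniform(size=(200, len(wavelengths)))
rgb = spectra @ S.T        # camera responses,           200 x 3
xyz = spectra @ CMF.T      # standard observer responses, 200 x 3
M, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)  # solves rgb @ M ~ xyz

# Because the Luther condition holds, the fit is numerically exact:
# there is one "correct" matrix, fully determined by the camera.
residual = np.max(np.abs(rgb @ M - xyz))
print(residual)
```

If `S` is perturbed so it is no longer a linear combination of `CMF`, the residual becomes irreducibly non-zero, and the choice of matrix becomes the trade-off described above.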
Hi Gerhard,

OK, let me be more precise: given a bunch of real cameras, I wonder how common it is for a camera's spectrum locus to fall outside the standard observer spectrum locus, when the matrix has been created by minimizing the (say) sum of squared errors over a representatively weighted basket of real world color spectra. Since such a weighted set of colors is likely to weight highly saturated colors fairly lightly, I wouldn't expect the accuracy for those colors to be as high as for the more common colors, and their errors could therefore well take them outside the standard observer spectrum locus.

While it's certainly possible to change the matrix optimisation objectives to minimize the area of the camera spectrum locus that falls outside the standard observer locus, by definition this will come at some cost to accuracy. The cost could be small for a large improvement, but that will depend on the details. Given that the spectral sensitivities of the camera sensors and real world spectra are not available, I wonder what a suitable criterion would be to add to the optimisation goals? Would some weighting against the camera primaries falling outside the standard observer spectrum locus be the best approach? And what if the camera deliberately defines its RGB colorspace with imaginary primaries (to avoid gamut limits)?

But the bottom line is that input devices have unavoidable inaccuracies in their emulation of the standard observer, and therefore all colors will have errors. Colors near the standard observer spectrum locus may, with those errors, fall outside the locus. These values still represent real world colors, and so should not be arbitrarily clipped.

cheers,
Graeme.
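One way the trade-off above could be expressed is as a penalised fit: the usual weighted sum of squared XYZ errors over a basket of spectra, plus a penalty whenever a monochromatic stimulus (a point on the camera's spectrum locus) maps to a negative XYZ component. The sketch below is entirely hypothetical: the "CMFs" and camera sensitivities are toy Gaussians (deliberately not a linear transform of each other, so the Luther condition fails), the basket and its weights are random, negative XYZ is used as a crude proxy for "outside the visual gamut", and `penalty_weight` is a made-up knob, not any standard parameter:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
wl = np.linspace(380, 730, 36)
g = lambda c, w: np.exp(-0.5 * ((wl - c) / w) ** 2)

CMF = np.stack([g(600, 40), g(550, 45), g(450, 30)])   # toy observer
S = np.stack([g(610, 50), g(545, 55), g(455, 35)])     # toy camera, non-Luther

# A "representatively weighted basket": broadband (mostly unsaturated)
# random spectra with random per-sample importance weights.
basket = rng.uniform(size=(300, len(wl))) ** 3
weights = rng.uniform(0.5, 1.5, size=300)
mono = np.eye(len(wl))                                 # monochromatic stimuli

rgb_b, xyz_b = basket @ S.T, basket @ CMF.T
rgb_m = mono @ S.T                                     # camera spectrum locus

def objective(m_flat, penalty_weight):
    M = m_flat.reshape(3, 3)
    # weighted squared XYZ error over the basket
    err = np.sum(weights[:, None] * (rgb_b @ M.T - xyz_b) ** 2)
    # penalise negative XYZ for spectrum-locus stimuli (imaginary colours)
    neg = np.minimum(rgb_m @ M.T, 0.0)
    return err + penalty_weight * np.sum(neg ** 2)

# Start from the unconstrained weighted least-squares matrix ...
W = np.sqrt(weights)[:, None]
M0 = np.linalg.lstsq(W * rgb_b, W * xyz_b, rcond=None)[0].T

# ... then trade a little basket accuracy for fewer out-of-gamut excursions.
res = minimize(objective, M0.ravel(), args=(1e4,), method="L-BFGS-B")
M_pen = res.x.reshape(3, 3)

def neg_xyz(M):
    """Total squared negative-XYZ excursion of the camera's spectrum locus."""
    return np.sum(np.minimum(rgb_m @ M.T, 0.0) ** 2)

print(neg_xyz(M0), neg_xyz(M_pen))   # compare excursions before/after
```

This only answers the easy half of the question: given the sensitivities, adding such a term is mechanical. The hard part Graeme raises, choosing the criterion when the sensitivities and real world spectra are not available, is not addressed by it.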