[argyllcms] Re: Displaying sRGB graphics on wide gamut monitors - gamma problem?

  • From: Graeme Gill <graeme@xxxxxxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Wed, 27 Oct 2010 23:48:11 +1100

Anders Torger wrote:
> Is it possible on a wide gamut monitor (colorants near AdobeRGB) to
> display sRGB graphics "correctly"?

Not without a matrix transform to change where the apparent primaries
are located.
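As an illustration of such a matrix transform, the sketch below maps linear-light sRGB values into linear AdobeRGB drive values by going through XYZ. The matrix values are the commonly published D65 ones, rounded to four digits, so treat the numbers as illustrative rather than exact; a real workflow would use the monitor's actual profile.

```python
import numpy as np

# Commonly published D65 RGB->XYZ matrices (rounded; illustrative only).
SRGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])
ADOBE_TO_XYZ = np.array([
    [0.5767, 0.1856, 0.1882],
    [0.2973, 0.6274, 0.0753],
    [0.0270, 0.0707, 0.9911],
])

def srgb_to_adobe_linear(rgb):
    """Map a linear-light sRGB triple to linear AdobeRGB drive values."""
    m = np.linalg.inv(ADOBE_TO_XYZ) @ SRGB_TO_XYZ
    return m @ np.asarray(rgb, dtype=float)

# White stays white (both spaces share the D65 white point),
# while saturated sRGB colours land inside the wider gamut.
print(srgb_to_adobe_linear([1.0, 1.0, 1.0]))
print(srgb_to_adobe_linear([1.0, 0.0, 0.0]))
```

Without this (or an equivalent profile-based) transform, feeding sRGB numbers straight to the wide-gamut panel stretches every colour toward the wider primaries, which is exactly the oversaturation the question is about.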

> Say my low ambient light mandates a gamma of 2.4 on the screen to get
> the best sRGB viewing conditions, and I calibrate my screen to that
> gamma. Will sRGB graphics then actually get gamma 2.4?

It depends on your definition. If your definition of gamma involves
the relative intensity of the calibrated or transformed output with respect
to the input, and you have arranged the calibration to follow a gamma 2.4 curve,
then by definition it has a gamma of 2.4.
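In those terms, gamma is just the shape of the signal-to-luminance transfer curve, independent of gamut width. A minimal sketch of that definition:

```python
# Gamma as a transfer curve: if the calibration makes normalized output
# luminance follow out = in ** 2.4, the display has gamma 2.4 by this
# definition, whatever its primaries are.
def display_luminance(signal, gamma=2.4):
    """Normalized luminance for a normalized input signal in [0, 1]."""
    return signal ** gamma

# A 50% input drives roughly 19% of full luminance at gamma 2.4:
print(display_luminance(0.5))
```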

> The sRGB white will be the same as the AdobeRGB white, but
> saturated colours will not be. I suspect that due to this there will
> be some gamma distortion. While grayscale will use the full monitor
> range, saturated colours will not "see" the whole monitor's gamma curve,
> but only up to the sRGB colour gamut edges.

No, I don't think so. If a primary (say blue) is further along the spectral
locus, and hence our eyes are less sensitive to it, then the maximum
intensity of that primary will be set higher so that the given white point
can still be achieved. The apparent luminance of saturated colors will
therefore be comparable between a wide gamut and an sRGB monitor, and the
progression will have the same shape if both are calibrated to gamma 2.4.
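The luminance (Y) rows of the usual D65 RGB-to-XYZ matrices illustrate this balance: each set of per-primary luminance contributions is normalized so that full drive of all three sums to the same white. The values below are the commonly published ones, rounded, and are meant only to show the scale of the effect.

```python
# Luminance (Y) contributed by each full-drive primary, taken from the
# middle row of the usual D65 RGB->XYZ matrices (rounded, illustrative).
SRGB_Y  = {"R": 0.2126, "G": 0.7152, "B": 0.0722}
ADOBE_Y = {"R": 0.2973, "G": 0.6274, "B": 0.0753}

# Both sets sum to the same white luminance (1.0), so even where a
# wide-gamut primary is more saturated, its full-drive luminance stays
# in the same ballpark as the sRGB one, and a gamma 2.4 progression
# toward it has the same shape on either monitor.
for prim in "RGB":
    print(prim, SRGB_Y[prim], ADOBE_Y[prim])
```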

Graeme Gill.
