[argyllcms] Re: Gamma wrong for calibration curves?

  • From: Graeme Gill <graeme@xxxxxxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Wed, 06 Dec 2006 14:43:42 +1100

william.wood@xxxxxxx wrote:
> I was naively assuming that by truncated you meant that the target gamma
> curve still goes through 0,0 although the device does not go that low.

You can do that, but really the idea of calibration is to aim
for a bit more precision, I think! The contrast ratio is pretty
important in how things will appear, hence the concentration
on issues such as adjusting the brightness control (CRT) to
set the display to a suitable black level. On an LCD
you don't have these sorts of controls; the black is
whatever it is.

> Let's call these three methods the "zero offset method", the "output offset
> method", and the "input offset method".

[I'll call the first one the "assume zero response" method.]
The reason I chose to switch to the "input offset method" is simply
that this better models how a CRT works. The Brightness control on
a CRT adds or subtracts an offset to the input signal. The Contrast
control changes the gain of the input signal (paradoxically, these
controls have the opposite visual effect to their names).
LCDs have a "Brightness" control that changes the strength of their
backlighting (so it is the analog of the CRT Contrast control). LCDs
have no analog of the CRT Brightness control, although some displays
fake a Contrast control by manipulating the shape of their
lookup tables.
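As a rough sketch of the control behaviour described above (my own illustrative model and numbers, not dispcal's code):

```python
# Simplified model of a CRT's transfer function.
# The gamma value and control settings are illustrative assumptions.

def crt_response(v, gain=1.0, offset=0.0, gamma=2.4):
    """Light output (0..1) for input signal v (0..1).
    offset models the CRT Brightness control (shifts the signal),
    gain models the CRT Contrast control (scales the signal)."""
    s = gain * v + offset
    return max(s, 0.0) ** gamma

# Raising "Brightness" (the offset) lifts the black level:
black_lifted = crt_response(0.0, offset=0.05)

# Lowering "Contrast" (the gain) dims the white point:
white_dimmed = crt_response(1.0, gain=0.8)
```

This matches the "paradox" noted above: the Brightness knob mainly moves the black point, while the Contrast knob mainly moves the white point.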

> The zero offset method allows multiple monitors to be calibrated to the
> exact same brightness curve (ideal gamma) if you match the white intensity
> of the monitors, without having to match the black intensities also (which
> would reduce the dynamic range of monitors with lower black intensities).

Right, but they will probably not look the same near the black end,
even if they do look similar in the light half of their response.

I think it comes down to what's meant by "gamma". The input offset
approach is more correct according to the technical definition
of gamma, since it is measuring the power of the (assumed) power
curve that the display has. I think this is the number a CRT or Video
engineer will give you, if you ask them to measure the gamma of a display.

The "assume zero response" approach is effectively asking
"what ideal gamma curve best matches the shape of the actual curve ?".
I can understand this approach has appeal in terms of a number
representing the gross characteristics of the curve,
but the problem with it (and the reason I switched to the "input
offset approach") is that if you use an ideal curve as your
target for calibration, you will end up with a response curve
that is the same as the ideal curve down to the black level
of the display, at which point it will "cut off" or "saturate",
giving no responsiveness at the very dark end. This would be quite bad.
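A small numeric sketch of the three target-curve models and the cut-off behaviour described above (the gamma and black level are hypothetical values, not measurements):

```python
# The three target-curve models discussed in this thread, plus a
# demonstration of the "cut off" problem with the zero offset method.
# BLACK and GAMMA are hypothetical values for illustration.

GAMMA = 2.2
BLACK = 0.005   # display black, as a fraction of white

def zero_offset(v):
    # Ideal curve through 0,0; the display can't go below BLACK,
    # so in practice the response saturates ("cuts off") there.
    return max(v ** GAMMA, BLACK)

def output_offset(v):
    # Ideal curve scaled and shifted on the output side.
    return BLACK + (1.0 - BLACK) * v ** GAMMA

def input_offset(v):
    # Offset added to the input signal, decreasing linearly to 0 at v=1,
    # chosen so the curve passes through BLACK at v=0 and 1.0 at v=1.
    ioff = BLACK ** (1.0 / GAMMA)
    return ((1.0 - ioff) * v + ioff) ** GAMMA

# Near black, the zero offset target is flat - no responsiveness:
dark_zero = [zero_offset(i / 100) for i in range(6)]

# ... while the input offset target still separates dark inputs:
dark_in = [input_offset(i / 100) for i in range(6)]
```

The flat region in `dark_zero` is exactly the "cut off" behaviour described above; the input offset curve stays monotonic down to the display black.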

By adopting as a target curve a model that better represents
the natural characteristic of the display, I am not trying to
force the display to do things that it doesn't naturally do,
and hence should end up with good control over it throughout
its full range, and minimal quantization loss through the
LUTs being used to create the calibration, particularly at
the black end.

Since LCDs are all set up to emulate CRT responses, and are
expected to do so, the same is applied to them, with a
provision that can avoid trying to enforce R=G=B neutrality
at the black point. (An interesting question is whether
a different model would be better for LCD displays).

> I was assuming that the actual shape of the curve really doesn't matter
> a great deal, since a profile will simply be used over the top of calibration.

For minimum quantization loss in the LUT's, a gamma close to the native
gamma of the display is probably the best choice.
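That quantization point can be illustrated with a toy 8-bit calibration LUT; the native gamma, bit depth, and helper below are my own assumptions for illustration:

```python
# Build an 8-bit calibration LUT that makes a hypothetical native
# gamma-2.4 display hit a target gamma, and count how many distinct
# output codes survive the rounding.

NATIVE = 2.4   # assumed native gamma of the display

def calibration_lut(target_gamma, bits=8):
    n = 2 ** bits
    lut = []
    for i in range(n):
        v = i / (n - 1)
        want = v ** target_gamma          # desired light output
        need = want ** (1.0 / NATIVE)     # input that produces it natively
        lut.append(round(need * (n - 1)))
    return lut

# Target equal to the native gamma gives an identity LUT: every code kept.
near = len(set(calibration_lut(2.4)))

# A target far from the native gamma collapses many codes together.
far = len(set(calibration_lut(1.0)))
```

The further the target is from the native response, the more input codes map onto the same output code, which is the quantization loss being referred to.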

> The input offset method assumes a linearly decreasing input offset value
> which decreases to 0 when the actual input is 1.  I don't understand the
> rationale or benefit of this approach; it would seem to create a misshapen
> gamma curve?  It creates very different results than the other two methods,
> which are very close (for my monitor at least).

What you've stumbled across is the difference between the popular
definition of gamma, and the technical definition. This is explained
quite well in Charles Poynton's "Digital video and HDTV", pages
272 & 273. You're right that it means that displays with the
same technical gamma can look quite different because their
overall response is quite different, due to differences in the
black level.

Perhaps the best thing I can do is to make dispcal communicate
with users using the popular definition of gamma, and translate
it into a technical gamma + input offset to define the actual
target curve. Fixing the "sRGB" and "L*" curves is not so easy.
The calibration with these is also rather dependent on the black
level using the "input offset" approach, and to make them behave
more like the popular expectation can only be done by "cutting off"
their black response, modifying the shape of the curves (thereby
defeating the whole idea of these target curves), or moving to
the "output offset" model for these targets.

> Please see the updated spreadsheet below where I have used your estimated
> input black offset value of 0.075605 to calculate the gamma error for the
> -k1 calibration curve using the "input offset method" (very good now at
> 0.0001), and also the error for the -k1 calibration curve with the
> corresponding profile also applied - this error is now a factor of 53 worse
> (0.2066 vs 0.0039 using the "zero offset method").

Right, because the (input) profile has its own interpretation of
the curve it's trying to reproduce, and with something like an sRGB
profile linked using a "passive" linking scheme (i.e. not using
something like Adobe's BPC), it will be trying to emulate the ideal
"zero response at black" curve, and simply cutting off the response
at the low end. Measured on the "input offset" model, the gamma will
be quite different to the stated gamma of the input profile, whereas
measuring the gamma using "assume zero response" will give a figure
much closer.

Of course you will get more detail at the black end if you use
something like Adobe BPC (I'm not sure if they let you use this
for the display profile), or were to link using an "active" CMM
such as Argyll's icclink -G when applying the display profile,
or were to use a source profile with a more realistic black
point than the traditional sRGB profile.
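For reference, the core idea behind BPC is a linear rescaling that maps the source black onto the destination black rather than clipping. This one-dimensional sketch is only illustrative; Adobe's actual algorithm operates on XYZ values:

```python
# One-dimensional sketch of black point compensation: rescale
# luminance so the source black lands on the destination black
# instead of being clipped. Values here are hypothetical.

def bpc(y, src_black, dst_black, white=1.0):
    """Map luminance y from [src_black..white] to [dst_black..white]."""
    scale = (white - dst_black) / (white - src_black)
    return (y - src_black) * scale + dst_black

# With sRGB's traditional "perfect black" source (src_black = 0),
# a zero input lands exactly on the display black instead of clipping:
shadow = bpc(0.0, 0.0, 0.005)
```

Shadow values below the display black are compressed into the displayable range rather than discarded, which is why BPC recovers dark detail.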

> Doesn't this suggest an
> inconsistency in approach between the calibration curve software and the
> profiling software, and that the profiling software is correcting for the
> calibration software, in effect setting the gamma back to what it would be
> using the zero or output offset methods? Visually it seems this is what is
> happening, as the image darkens when the profile is applied.

Yes, you're correct, there are issues with regard to black points. The ICC
attempted to address this to some degree with one of the updates that
recommended moving away from a default "perfect black" black point,
but the effects of this change have not fully propagated, and a real
fix means moving to an "active" CMM, something that is a bit rare
at the moment.

Graeme Gill.
