[argyllcms] Re: "Washed" / low contrast colors on calibrated display

  • From: Graeme Gill <graeme@xxxxxxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Mon, 07 Jan 2008 12:59:49 +1100

Frédéric Crozat wrote:
> After a lot of trials with dispcal, I'm getting my first results, but
> I'm not really happy with them: images seem to lack vivid colors,
> everything looks "washed" (sorry if I'm not using the right term).

It's difficult to make these sorts of subjective assessments if
you don't know what colorspace the images you are looking at
are in. As soon as you change a display to a different
colorspace, the images "look bad". This doesn't necessarily
mean that the display is set up badly.

> I'm having the same results with different monitors (all LCD, one
> medium-level Sony and two entry-level Belinea), so I'm wondering if
> I'm doing everything correctly. I've shown this to our art designer at
> Mandriva (since she knows how this stuff is supposed to look), and she
> agreed with me that the colors look as if they were supposed to be
> printed, which is not the intent for our Mandriva display background
> (for instance).

Right, but how have these backgrounds been created?
For instance, if they've been created interactively on a monitor
set to some other colorspace, they will of course look bad
when displayed on a monitor set to a different colorspace.

> I'm also not sure what the "best" settings for brightness are (I know
> it is very subjective).

These can all be very long discussions, and have been discussed
at some length on other mailing lists (e.g. the Apple ColorSync list).

It boils down to two things though:

   What is the equipment capable of without introducing side effects?

   What are you trying to do?

You can make adjustments to a display using its controls and
the video card LUTs. Generally the former are more powerful
and have fewer side effects. There can be exceptions though:
for instance, LCDs have no native contrast control capability,
only brightness, so contrast is usually faked by manipulating
the lookup curves, which can introduce side effects. The same
goes for white point control on an LCD (unless it has R/G/B LED
backlighting). So in general LCD displays are much less
flexible than CRT displays in targeting some non-native
colorspace without introducing side effects.
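
As a rough illustration of the level-resolution cost (a Python sketch,
not Argyll code; the 8-bit LUT depth is an assumption, real video
cards may carry more precision):

    # Push a gamma/contrast tweak through an 8-bit video card LUT and
    # count how many distinct output levels survive the quantization.
    def make_lut(gamma, contrast=1.0, steps=256):
        lut = []
        for i in range(steps):
            v = (i / (steps - 1)) ** (1.0 / gamma)  # gamma adjustment
            v = min(max(v * contrast, 0.0), 1.0)    # crude "faked" contrast
            lut.append(round(v * (steps - 1)))      # quantize to 8 bits
        return lut

    identity = make_lut(1.0)
    tweaked = make_lut(1.2, contrast=0.85)
    print(len(set(identity)), "distinct levels")  # 256
    print(len(set(tweaked)), "distinct levels")   # noticeably fewer

Any input levels that collapse together in the LUT are lost for good,
which is one of the side effects referred to above.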

Brightness depends on what you are trying to do. If
you are trying to do soft proofing, for instance, you
will have some brightness level in mind, dictated by
the hard proofing booth you are comparing to, or by
the ambient brightness level. For good color judgement
and low fatigue it's desirable that the display brightness
roughly match that of the ambient lighting.

In terms of what you are trying to do, it comes down to
what colorspace you want the display to be in, and how
far that is from the display's native behaviour. A CRT is
reasonably flexible in the behaviour it can be given
without side effects; an LCD less so. If you want to
minimize artefacts on an LCD, you want to set the
contrast and white point to their native values (i.e.
where the monitor is not manipulating the digital
signal levels). It may not be easy to figure out what
these are. In this scenario you would probably only
want calibration to set the transfer characteristic
and neutral axis, and leave the white point native.

For typical MSWindows/Linux this would probably
be the sRGB transfer ("gamma") curve, or
a gamma of about 2.2. For OS X it would probably
be a gamma of 1.8.
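
For reference, the sRGB curve is not quite a pure power curve: it has
a short linear segment near black. A quick sketch of the two (these
are the standard published formulas, nothing Argyll-specific):

    # sRGB transfer function (to linear light) vs. a pure 2.2 power curve.
    def srgb_to_linear(v):
        # Piecewise sRGB: linear segment near black, 2.4 power above it.
        return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

    def gamma22_to_linear(v):
        return v ** 2.2

    for v in (0.02, 0.1, 0.5, 0.9):
        print(v, srgb_to_linear(v), gamma22_to_linear(v))

The two agree closely at mid and high levels and diverge near black,
where sRGB's linear segment avoids an infinite-slope toe.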

The nominal white point of a display is D65 (set by
television standards), and I'd imagine an LCD's native
white point is somewhere near there, but it is dictated
by the backlight color. CRTs give maximum brightness
at a much higher white point (9000K or so), but this
can be reduced with few side effects (just reduced
brightness) using typical CRT controls.

If you have specific requirements (e.g. trying to do
soft proofing) then you may want to target a specific
white point and brightness, and be prepared to
compromise other aspects of the display to achieve this.
By all means use the monitor controls to move the display
in the direction you want to go, and then use the
calibration curves to get the rest of the way. If you
are moving far from native (especially on an LCD),
you may find the side effects unacceptable though.

> [...my monitors only offer three color temperature settings]
> (9300K, 6500K and user defined (i.e. only Red and Blue offsets)). However,
> when I check the color temperature with dispcal -R, I get a white point at
> 5875K (when 9300K is chosen on the monitor), 5190K (when choosing 6500K).
> So, I'm wondering what is the best setting to choose (and if I should
> force a value using -t)?

Hard to say why that is. Either the monitor is lying (possible
with an LCD - it's hard to move the white point that much without
losing significant brightness or level resolution), or the
instrument isn't measuring the spectral content of that display
accurately.
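
For what it's worth, you can sanity-check a reported colour
temperature from a measured CIE xy chromaticity with McCamy's
approximation (a sketch only; this is not necessarily what dispcal
uses internally, and the formula is only valid very roughly between
about 2000K and 12500K):

    def mccamy_cct(x, y):
        # McCamy's cubic approximation of correlated colour temperature
        # from CIE 1931 xy chromaticity coordinates.
        n = (x - 0.3320) / (0.1858 - y)
        return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

    print(round(mccamy_cct(0.3127, 0.3290)))  # D65 white -> ~6504K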

> Also, the only way to change black point and white point is to change
> the user defined "temperature", so I can't get a low dE for both black
> point and white point. Which one should I try to get with the smallest
> dE?

You do what you can, and you decide how far to "push" it. The
white point is more important, but you usually can't do much
with an LCD.
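
For reference, the dE figures are colour differences in CIE L*a*b*;
in the simplest (CIE76) form it is just the Euclidean distance (a
sketch; dispcal may well use a later formula such as CIE94):

    def delta_e_76(lab1, lab2):
        # CIE76 colour difference: Euclidean distance in L*a*b* space.
        return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

    # e.g. a measured white vs. an ideal target white (illustrative values)
    print(delta_e_76((96.5, -0.4, 1.2), (100.0, 0.0, 0.0)))  # ~3.7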

> BTW, there is a slight error in dispcal usage (-h): -q accepts u
> (ultra high), but it is not printed in the usage output.

Sigh. That's because people often seem to do silly things when they
see "ultra high": they choose it first, and then complain that
"it's really slow" or something. It's there just for me to test
things (to "bracket" the range of the usable settings).

My advice is to always use the default first, and then
move from the default when you have a reason to do so
(to fix a problem, achieve an aim, or explore variations).

Graeme Gill.


