[argyllcms] Re: AW: Re: AW: List of wide gamut displays

  • From: Graeme Gill <graeme@xxxxxxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Fri, 14 Oct 2011 18:59:32 +1100

Jens Heermann wrote:
> Hardware calibration offers the possibility to make corrections directly in
> the monitor controller and to adjust the internal LUT (look-up-table). This
> LUT offers a resolution of 14-16bit, in the newest models even 10-14bit per
> channel (RGB, 3D-LUT), which means that you have up to 16384 (14bit)
> supporting points for a correction instead of 256 (8bit) as you have in the
> video card, due to limitations of the OS. This can lead i.e. to massive
> tearing in gradients.

I'd take the "14-16" bit claims with a fair dose of salt. It sounds like
a numbers game to me, rather than something that is proven to be necessary,
or even implemented in a way that has any visible effect. For instance,
the display panels themselves rarely have more than 8-10 bits of precision
(a lot of Apple displays are/were 6 bit!), so anything more has to be
faked using dithering, and real world imagery simply doesn't have the
sort of S/N (signal to noise) ratio that would need such high resolution.
Instrument measurements don't have anything like that S/N ratio either,
and a general 3D transform has to use 3D LUTs, which will have very
limited grid resolution due to memory size constraints.

So while it's nice to have some extra precision available at certain parts
of the processing, you can't assume that it will result in any visible 
difference.
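As a rough illustration of the points above (this is an illustrative sketch, not Argyll code, and the bit depths and gamma are just example figures): a 14-bit LUT feeding an 8-bit panel can still only ever produce 256 distinct codes without dithering, and 3D LUT memory grows cubically with grid resolution.

```python
# Illustrative sketch: LUT precision beyond the panel's bit depth
# collapses at the panel, unless dithering is used to fake it.

def distinct_panel_codes(lut_bits, panel_bits, gamma=2.2):
    """Feed every LUT entry through a gamma curve, quantized to panel depth."""
    lut_size = 2 ** lut_bits
    panel_max = 2 ** panel_bits - 1
    codes = {round(((i / (lut_size - 1)) ** gamma) * panel_max)
             for i in range(lut_size)}
    return len(codes)

print(distinct_panel_codes(14, 8))   # 14-bit LUT, 8-bit panel: 256 codes
print(distinct_panel_codes(8, 8))    # 8-bit LUT, 8-bit panel: fewer codes,
                                     # since a gamma curve merges dark levels

# Why a general 3D LUT has limited grid resolution: even a modest
# 33x33x33 grid of 3 x 16-bit entries already costs this many bytes:
print(33 ** 3 * 3 * 2)
```

Note the comparison: the extra input precision of the larger LUT avoids losing codes through the curve, but it can never produce more output levels than the panel itself supports.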

> The other thing is: while calibrating your device to a specific condition
> (i.e. whitepoint D50), you have to "deform" the original gamut to fit the
> desired target. 8bit-calibration will limit the range for modification and
> corrections and will definitely shrink the gamut to nearly 2/3 of the
> original. 

This is simply incorrect. The precision has nothing directly
to do with the gamut. By scaling the brightness down to fit
the white point shifted space, the gamut will not be clipped
or "deformed".

[This is exactly why you may see the message
"Had to scale brightness from XXX to YYY to fit within gamut"
from dispcal on occasions]

Yes, there can be quantization issues if all you've got is 8-bit
precision with which to make large white point shifts.
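To put a number on that quantization cost (another illustrative sketch; the 78% scale factor is just an example figure for a large white point shift): scaling a channel down in an 8-bit calibration discards output levels, while the same shift at 10 bits still leaves more levels than an 8-bit panel can show.

```python
# Sketch: a white point shift done by scaling a channel in an N-bit
# calibration costs distinct output levels. Factor is illustrative.

def levels_after_scale(bits, factor):
    """Distinct output codes left after scaling a full-range channel."""
    maxv = 2 ** bits - 1
    return len({round(i * factor) for i in range(maxv + 1)})

# Say a large shift scales the blue channel down to 78% of its range:
print(levels_after_scale(8, 0.78))    # 200 of 256 levels survive
print(levels_after_scale(10, 0.78))   # 799 levels: still plenty for 8 bits
```

This is the sense in which 8-bit-only calibration can cause visible banding on large shifts, without the gamut itself being any smaller.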

> Also the backlight is often not optimized for D50 and therefore
> print-related tasks, therefore you will lose much of the gamut to the
> deformation towards D50. This is mainly the reason why LCD-TVs cannot be
> used for high-quality softproofing. Their backlight is optimized to video
> standards, not to print standards.

See above. This has nothing to do with color gamut, but with brightness
and quantization.

> So, the only way to preserve most of the technical possible gamut offered by
> the panel is to perform a hardware calibration.

Internal calibration is no different to display card calibration in terms
of available gamut, since it has to work within the same limitations of
backlight color and maximum channel intensities. It may have an advantage
in scaling the primary color ranges without introducing excessive
quantization, but often in such cases the internal calibration can be
used to shift the white point using the on screen controls while still
retaining full 8/10 bit precision over the video interface, allowing
full precision external calibration. If the display has a variable color
RGB backlight, this can be taken advantage of by both internal and
external calibration.

The major advantage that external, display card calibration has is that
it is universal and standardized. Unfortunately, the same cannot be said
for internal calibration.

Graeme Gill.
