[argyllcms] Re: Modifying internal display LUTs

  • From: Knut Inge <knutinh@xxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Fri, 28 Oct 2011 08:12:23 +0200

Thank you for your reply.
 On Fri, Oct 28, 2011 at 4:47 AM, Graeme Gill <graeme@xxxxxxxxxxxxx> wrote:

> There are no technical discussions about this as far as I am aware, as
> such things are all proprietary.
>

Ethan Hansen did provide some hints here:
http://www.luminous-landscape.com/forum/index.php?topic=53825.140;topicseen

On Fri, Oct 28, 2011 at 4:47 AM, Graeme Gill <graeme@xxxxxxxxxxxxx> wrote:

> There is no guarantee as to what internal depth a display supports, or
> how it is implemented
>

Of course. But if some dedicated person is able to reverse-engineer the DDC
protocol used, and the display accepts 4096 values per primary, I think that
is a solid indicator of the size of its internal LUT (4096 = 2^12, i.e. a
12-bit table).

Given a signal chain of:
camera raw -> raw developer -> OS APIs -> GPU -> VGA/DVI/DP -> display LUT ->
native panel

I think that there may be a number of stages of processing/requantization in
that chain. It might make good sense (from a mathematical standpoint, perhaps
less important from a practical one) to do such processing as late as
possible. The best example I can think of is gamma. If you want gamma applied
to your image, you are basically expanding 8 bits of e.g. sRGB into 12-13
bits of linear. You don't want to do that prior to an 8-bit limited link. In
other words, you probably don't want your display to be linear.
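Here is a quick back-of-the-envelope in Python (my own illustration, only
indicative) of where the 12-13 bits come from: the darkest step of 8-bit sRGB
is so small in linear light that you need roughly 12 linear bits just to keep
it distinct.

    import math

    def srgb_to_linear(v):  # v in 0..1, standard sRGB decoding
        return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

    # smallest non-zero step in linear light: code 0 -> code 1
    step = srgb_to_linear(1 / 255.0) - srgb_to_linear(0.0)
    print(math.ceil(math.log2(1.0 / step)))  # ~12 linear bits needed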

On Fri, Oct 28, 2011 at 4:47 AM, Graeme Gill <graeme@xxxxxxxxxxxxx> wrote:

>  (the native LCD often has limited depth, and extra
> depth is faked using spatial dithering).
>

There is a misconception about dithering. What dithering does is trade
spatial/temporal resolution for amplitude resolution. So if you have a
10-bit input and an 8-bit output, you preserve _more_ of the 10-bit
amplitude information by dithering into 8 bits than by simply truncating.
But it will cause some quasi-randomness in either time or space. If the
temporal and spatial resolution is high enough that our senses tend to
smooth out this randomness, then we are all good.
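A toy example in Python (my own, not from the thread): quantize a 10-bit
level to 8 bits with and without dither, and look at the average over many
frames/pixels.

    import random

    level10 = 517  # a 10-bit code; the ideal 8-bit value is 129.25

    trunc  = [int(level10 / 4) for _ in range(10000)]
    dither = [int((level10 + random.uniform(-2, 2)) / 4 + 0.5)
              for _ in range(10000)]

    print(sum(trunc)  / len(trunc))   # 129.0   -> the .25 is lost
    print(sum(dither) / len(dither))  # ~129.25 -> the average keeps the 10-bit level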

As a side note, most CD players use similar techniques to offer 16 bits of
audio quality from a 1-bit or 5-bit D/A converter. It turns out that D/A
converters using few bits plus dithering/noise shaping can be made better
and cheaper than direct 16-bit ones. Perhaps the same is the case for LCD
displays (the 9th and 10th LSBs are relatively unimportant, are they not?)
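The same idea in the 1-bit direction, as a very rough sketch (again my own
illustration): a first-order delta-sigma loop turns a constant level into a
1-bit stream whose average matches the input, with the quantization error
pushed to high frequencies where a filter (or our senses) smooths it away.

    def delta_sigma_1bit(level, n):
        # first-order error-feedback modulator, level in 0..1
        acc, bits = 0.0, []
        for _ in range(n):
            acc += level
            bit = 1 if acc >= 1.0 else 0
            acc -= bit
            bits.append(bit)
        return bits

    bits = delta_sigma_1bit(0.3, 10000)
    print(sum(bits) / len(bits))  # ~0.3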

On Fri, Oct 28, 2011 at 4:47 AM, Graeme Gill <graeme@xxxxxxxxxxxxx> wrote:

> Only high end graphics/color displays tend to have
> accessible internal LUTs, and there are no standards and none of it
> is public.
>

True, but companies such as Entech and open-source communities such as
ddccontrol seem to have a lot of knowledge, even about proprietary
signalling:
http://entechtaiwan.com/
http://ddccontrol.sourceforge.net/

There seem to be hints that even less high-end manufacturers (like Dell) may
have sort-of accessible LUTs that they choose not to reveal. One argument is
that there are seemingly only two or a handful of manufacturers of display
controllers out there.

On Fri, Oct 28, 2011 at 4:47 AM, Graeme Gill <graeme@xxxxxxxxxxxxx> wrote:

> Note also that analog VGA and Display Port allows more than 8 bit
> precision, and even DVI allows it in theory, using dual links (in practice
> I suspect this has never actually been implemented by any display or
> graphics card.)
>

Again, true. I have a Dell display that can accept 10-bit over DisplayPort,
and ATI cards seem to have supported 10-bit in some form for ages. But your
application needs to use OpenGL to access it (at least on Windows), and
Lightroom does not do that.
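By "use OpenGL" I mean something along these lines (my own sketch, here via
the glfw Python bindings; whether the driver actually grants a 10-bit-per-
channel framebuffer depends on the card, driver and display):

    import glfw

    glfw.init()
    glfw.window_hint(glfw.RED_BITS, 10)
    glfw.window_hint(glfw.GREEN_BITS, 10)
    glfw.window_hint(glfw.BLUE_BITS, 10)
    glfw.window_hint(glfw.ALPHA_BITS, 2)
    win = glfw.create_window(640, 480, "30-bit framebuffer test", None, None)
    print("window created:", win is not None)
    glfw.terminate()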

This would only benefit color-management-aware applications, not those that
assume the display is sRGB.


> That's my general opinion, although there are circumstances when
> another arrangement may be preferable, such as feeding video into
> the display, since few video systems support color profiles.


I agree. Videos don't look good on my system using the native response of my
wide-gamut display, as my video software is unaware of its profile. For that
case, being able to switch automatically to an sRGB mode (even if that meant
calling the "sRGB" preset of my display) would be a major ergonomic bonus.



> If you have a wide gamut display though, you are then unable to make use of
> it.


I think it is really hard to make use of the wide gamut even when I put
effort into it. For day-to-day surfing, mail reading and such I would settle
for an automatically calibrated sRGB response.


> Doing the color management in the computer system has the advantage that
> different color spaces can be mixed on screen (although they may visually
> interfere with each other).


True, but given the state of color management, that is way down on my list.
I'd first like to get realistic colors for single applications in full
screen, every time, without user intervention.

Thanks for your efforts and time.

-k
