[argyllcms] Re: Modifying internal display LUTs

  • From: Knut Inge <knutinh@xxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Fri, 28 Oct 2011 09:27:17 +0200

On Fri, Oct 28, 2011 at 9:05 AM, Graeme Gill <graeme@xxxxxxxxxxxxx> wrote:

> Right, but they combine it with over sampling to get an effective
> 16 bit or better resolution over audio bandwidth. The visual equivalent
> would be to use higher resolution displays and trade the extra
> spatial resolution for level resolution. They don't do that. What
> they do instead is more analogous to 8 bit audio dithering, where
> the dithering is in the hearing range, and the result is
> a very audible hiss that may be better than straight 8 bit quantization.


At what spatial resolution does a display start to be "oversampling"? In one
strict interpretation, you could demand that they add an optical blur film
that limits the resolution. In practice, I think that my 27" 2560x1440 at
regular viewing distances is to some degree "oversampled", in that limitations
in my vision mean that I cannot see the difference between the neighboring
pixels
p1 = [25 25 25], p2 = [25 25 25]
and
p1 = [25 25 26], p2 = [25 25 24]

And when watching still images, there is no temporal information to represent.
If a pixel p1 wiggles between [25 25 26] and [25 25 24] at a rate of 60 Hz or
more, I am guessing that it won't detract much from my experience.

In other words: high-frequency, low-level spatial noise can be injected into
the system with little loss of perceived quality. If display manufacturers are
able to do this and improve total quality versus cost, then I am all for it.
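
To make that concrete, here is a tiny Python sketch (the target level and the
2x2 pattern are just made-up illustrations, not anything a real panel
necessarily does): by alternating the two nearest 8-bit codes over a small
neighborhood, or over successive frames, the local average lands on a level
that 8 bits alone cannot encode.

# Represent the level 25.5, which has no 8-bit code, by alternating the
# codes 25 and 26 over a 2x2 neighborhood (or over successive frames).
patch = [[25, 26],
         [26, 25]]

avg = sum(sum(row) for row in patch) / 4.0
print(avg)   # 25.5 -- extra level resolution bought with spatial resolution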

But most video processing seems to take 8-bit input, do some intermediate
calculations, then do a straight rounding to 8 bits again. Probably not
optimal.
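
A quick sketch of the difference (plain Python; the 0.4 scale factor is an
arbitrary stand-in for whatever the intermediate calculation is):

import random
random.seed(1)

src   = list(range(20, 30))        # a short, dark 8-bit input ramp
ideal = [v * 0.4 for v in src]     # intermediate result, no longer integer

# Straight rounding back to 8 bits: nearby inputs collapse to the same code.
print([round(x) for x in ideal])   # [8, 8, 9, 9, 10, 10, 10, 11, 11, 12]

# Add +/- 0.5 LSB of random dither before rounding, and average many frames:
def dither(x):
    return round(x + random.uniform(-0.5, 0.5))

avg = [sum(dither(x) for _ in range(1000)) / 1000.0 for x in ideal]
print([round(a, 2) for a in avg])  # stays close to the ideal 8.0, 8.4, 8.8, ...

Per frame the dithered version is noisier, but the in-between levels are not
simply thrown away the way they are with straight rounding.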


> Even if that's the case, then unless they document it, or unless
> there is a utility that can exercise it, it's pretty hard to figure
> out how to drive it.
>

Yes. But it seems that whenever I have a problem like this, there is some
eager open-source developer out there who chose to solve the problem herself.
I should be a lot more generous with my donations :-).

Maybe I should pester Dell support. I actually did that with my HP printer:
I had to call them on an expensive support line, wading through an endless
row of know-nothing first- and second-line support staff who suggested that I
use "red-eye removal" when I was actually asking why the HP paper profiles
had disappeared when I upgraded to Windows 7. In the end, I reached a helpful
lady who could tell me that my printer was not targeted at photography, and
that they had chosen to remove the paper profiles when porting its driver to
Windows 7. Money spent, problem not solved...


>
> > Again, true. I have a Dell display that can accept 10-bit over Displayport,
> > and ATI cards seem to have supported 10-bit in some form for ages. But your
> > application needs to use OpenGL to access it (at least on Windows), and
> > Lightroom does not do that.
>
> You're confusing frame buffer with LUT. There has been 10 bit LUT support
> for some time.


Thank you for the clarification. But for the ideal of having
Lightroom/Photoshop color management that has complete knowledge of both the
image capture characteristics and the display characteristics via profiling,
and then applies the ideal mapping in one place only, you would ideally want
to extend the high-resolution pipeline all the way into the CM-aware
application, right?
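
Just to illustrate what I mean (toy Python, with two made-up 1-D tone curves
standing in for a working-space conversion and a display calibration; real
profiles are obviously more than simple curves):

def stage1(x):                 # e.g. a working-space gamma tweak (illustrative)
    return 255.0 * (x / 255.0) ** 1.8

def stage2(x):                 # e.g. a display-calibration curve (illustrative)
    return 255.0 * (x / 255.0) ** (1.0 / 2.2)

src = range(256)

# Pipeline A: round to 8 bits between the stages (what much software does).
a = [round(stage2(round(stage1(v)))) for v in src]

# Pipeline B: one combined mapping at high precision, quantized once at the end.
b = [round(stage2(stage1(v))) for v in src]

print("distinct output levels, two 8-bit stages:", len(set(a)))
print("distinct output levels, single mapping:  ", len(set(b)))
print("max difference between the pipelines:    ",
      max(abs(x - y) for x, y in zip(a, b)))

The two-stage 8-bit pipeline ends up with fewer distinct output levels,
especially in the shadows, which is exactly the kind of banding a single
high-precision mapping inside the CM-aware application would avoid.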

-k
