[argyllcms] Re: Average bug/Density curves/ xicclu -g predictability issue

  • From: Elena [service address] <1007140@xxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Fri, 21 Jan 2011 14:03:21 +0100

Hello Graeme

On 20-Jan-2011, Graeme Gill wrote:

>   I think I've covered this fairly extensively in previous posts.
> To summarise: These are likely the result of two factors, neither of which
> can be solved with the current code: 1) While adjusting the black generation
> curve can smooth the transition to the gamut surface on the neutral axis,
> there is no way of ensuring a smooth transition everywhere. 2) The black
> value on the gamut surface may itself be non-smooth.

Yes - please don't think I have attention deficit disorder; it's just that
we look at the problem of smoothness/continuity from two different points
of view. Mine is that of an intuitive experimenter and also a theoretician,
with some specific experience; yours is that of a (very skilled, of course)
person who actually wrote code based on precise concepts, and with a
different background. You look at the problem in topology/gamut-shape
terms, while I perhaps think about the same thing with a numerical,
dimension-free approach.
I'm not here to convince you to change the way your current code works,
of course. But allow me a few hints - or just thoughts out loud, if you
prefer.

Some time ago, before I knew about ICC profiles and color management
workflows in general, I was very disappointed by the results offered by the
RGB driver of my old printer, because I felt the colors just didn't match
what I was seeing on screen. I didn't have a spectro back then, just a good
color scanner. After I managed to write a low-level CMYK driver for my
first inkjet printer (my first attempt at reverse engineering an Epson -
hard work!), I tried a very crude approach to color conversion from RGB to
CMYK (very crude, but playing with these things you always learn a lot!).
In short, test charts were printed covering many thousands of CMYK
combinations placed regularly on the 4D grid, skipping only the
combinations that exceeded the ink limit I imposed, and then scanned. No
color management was used at all, and the scanner parameters were adjusted
by eye to give the best visual match between the physical chart and its RGB
scan on screen. My experimental software, simply and crudely, ran a
time-consuming brute-force search for the patches in the scanned chart with
the least squared difference (between RGB vectors) from the regular points
of the RGB cube, assigning a CMYK value to every RGB node - and the
resulting 3D LUT was saved.
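Roughly, the idea was something like the following sketch (the names, grid
resolution and array layout here are made up for illustration, not my
original code):

    import numpy as np

    # scanned_rgb: (N, 3) array with the scanner RGB of each printed patch
    # patch_cmyk:  (N, 4) array with the CMYK values that produced each patch
    def build_lut(scanned_rgb, patch_cmyk, steps=17):
        axis = np.linspace(0.0, 255.0, steps)
        lut = np.zeros((steps, steps, steps, 4))
        for i, r in enumerate(axis):
            for j, g in enumerate(axis):
                for k, b in enumerate(axis):
                    # brute force: take the patch whose scanned RGB has the
                    # smallest squared distance to this regular RGB node
                    d2 = np.sum((scanned_rgb - [r, g, b]) ** 2, axis=1)
                    lut[i, j, k] = patch_cmyk[np.argmin(d2)]
        return lut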
Why am I telling you this story? Because I remember well the kind of
discontinuities I ran into with this approach. For example, analysing a
slice I could find one (or more) CMYK steps breaking the overall regular
progression, simply because at search time those offending CMYK
combinations really were the best-matching ones! Now suppose (absurdly)
that you convert and print the source RGB image through the RGBxCMYK LUT
computed this way without any interpolation, i.e. with severe quantization
bands (I don't remember what grid resolution I used, but certainly not
256^3!): no noticeable glitches were visible. But since you must use some
interpolation, the interpolated values around the offending knots come out
completely wrong!
To explain better: suppose that through one slice you have a regular CMYK
progression, but one or two knots have, for example, K and M swapped,
because the search algorithm judges them colorimetrically correct. You
won't notice the knots themselves, but you will notice all the interpolated
points around them, which can never match.
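Just to put invented numbers on it (these values are made up purely to
illustrate, they are not measurements):

    # two adjacent LUT knots along a slice; knot_b has M and K roughly
    # swapped, even though in isolation it matched its RGB node best
    knot_a = {'C': 0.10, 'M': 0.40, 'Y': 0.40, 'K': 0.05}
    knot_b = {'C': 0.12, 'M': 0.06, 'Y': 0.42, 'K': 0.44}

    # what linear interpolation produces halfway between them
    mid = {ch: 0.5 * (knot_a[ch] + knot_b[ch]) for ch in 'CMYK'}
    # mid == {'C': 0.11, 'M': 0.23, 'Y': 0.41, 'K': 0.245}

Each knot on its own may print an acceptable color, but the halfway point
is neither mixture: it carries half the magenta of knot_a plus a large dose
of black, so it cannot match the color that should sit between the two.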

I definitely see similarities here when analysing slices of gradients
converted with Argyll profiles. However much more refined and complex the
algorithm you use surely is than a brute-force search, I suspect the
problem is the same. Note how I see the problem differently: not in gamut
terms, but analytically.

It may then be worth spending a few more words on how I eventually decided
to fix the problem. I chose to get rid of K, given how many problems a 4D
space is prone to give: too many combinations with (almost) the same visual
result, and hence the risk of the discontinuities mentioned above. I
confined the problem to 3D space. The RGB->CMYK conversion was performed
algorithmically, in an idealized fashion, applying a rigid mathematical GCR
towards a well-established, unchangeable, mathematically defined K curve,
taking only the total ink limit into account. Then I "profiled" the RGB
side (using the same brute-force search method as above), generating an
RGBxRGB LUT. I was quite happy with the results for those times.
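The idea in sketch form (the K curve and the ink limit below are
hypothetical placeholders, not the ones I actually used):

    # rigid GCR: K is a fixed mathematical function of the grey component,
    # so RGB->CMYK is single-valued and the whole problem stays 3D
    def rgb_to_cmyk(r, g, b, ink_limit=3.0):
        c, m, y = 1.0 - r, 1.0 - g, 1.0 - b   # naive complements, 0..1 range
        k = min(c, m, y) ** 1.5               # hypothetical fixed K curve
        c, m, y = c - k, m - k, y - k
        total = c + m + y + k
        if total > ink_limit:                 # respect the total ink limit
            s = ink_limit / total
            c, m, y, k = c * s, m * s, y * s, k * s
        return c, m, y, k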

So I think that turning the B2A generation process into a 3D problem could
really improve profile smoothness and regularity.

> While not perfect, the result you have seems reasonable. Whether it
> is satisfactory depends on the type of imagery you are reproducing. If
> the imagery does not often have transitions that go to the gamut surface in
> problematic areas, it is likely to be satisfactory.

In general I (and I think perhaps everybody) want a profile to be reliable
in the widest range of situations. Bumps/discontinuities will show up more
when printing synthetic graphics with gradients than when printing photos,
but they can show up even in real-world photos if you're unlucky enough to
hit a troublesome slice (maybe a yellow hat showing colored bands in the
shadow areas, or distorted green shadows in foliage, etc.).

/&
