[argyllcms] 10-bit/color systems

  • From: János, Tóth F. <janos666@xxxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Wed, 1 Dec 2010 19:44:15 +0100

Since I managed to get my hands on a used FirePro V5700 (for a low price) to
use as a 'testing toy', I would like to revive my "forget about 1D LUT
calibration with its predefined targets and make only one, but high-quality,
profile for everything" story.
Feel free to skip over the /// parts if you have read my previous
mails... :)

/// OFF
Now, I can confirm that the Dell U2410 (A00 hardware, A02 firmware in my
case) really can work in 10 bit/color mode!
I feel it's important to say this, because I couldn't find any reports of
success; only doubtful questions, answered by expert smart@sses with "Poor
stupid man. Of course it won't work! Do you think you can have a 10-bit
display under $2000? Stop asking stupid questions, will you? Read some
reviews instead!" - ~ Spyre's opinion in my words. :D
/// OFF

/// A little summary about the idea: ///
With WCG displays (various, almost panel-series-unique gamuts) and the usual
targets (sRGB, PAL, AdobeRGB, etc.), you have to remix almost every color
(except the pure grays, once calibrated - but there are only 256 of those,
as opposed to the available millions of colors; or 1024 out of the billions
of colors, etc.).

So, why would you bother yourself with white point and gamma calibration if
it does only a small part of the job, and does it with fixed, predefined
targets? (You theoretically need different LUTs for different targets to
achieve the optimal result with different standards, and you have to choose
between them manually for every job.)
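
To illustrate what I mean, here is a little Python sketch (with invented
numbers, not measurements): a per-channel 1D LUT can reshape each channel's
response, but it can never mix channels, so it cannot pull a wide-gamut
primary toward a smaller target gamut:

import numpy as np

# A per-channel (1D) LUT can only remap R, G and B independently; it cannot
# mix channels. The numbers below are invented for illustration: suppose the
# WCG panel's pure red, expressed in the target space, would need a small
# negative green component to be reproduced exactly.
wcg_red_in_target = np.array([1.10, -0.08, 0.00])

def apply_1d_lut(rgb, curve):
    # a 1D calibration LUT: one monotone curve per channel, no mixing
    return curve(np.asarray(rgb, dtype=float))

# Whatever curve we pick, a pure-red input keeps green at exactly 0.0, so the
# needed -0.08 green can never be produced by 1D calibration alone:
print(apply_1d_lut([1.0, 0.0, 0.0], lambda c: c ** 1.1))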

One good answer would be: the VGA output is limited to 8 bits, while you can
find displays with much better internal accuracy. -> Here comes the "full
30-bit pipeline" solution. And yes, there are some displays with 16-bit
internal processing, but not everybody is rich, and the issues above still
apply to them when that 16-bit internal LUT is only a 1D LUT. And the best
you can find is a 12-bit internal LUT, with a big price tag...

In theory, a "curves + matrix" profile can do the whole job, whether the
display is calibrated or not. And I don't like to spread the work across
different elements of the pipeline, where every step has its own random
inaccuracies...
And we haven't even talked about XYZ LUT profiles yet.
So, why not...?
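
For clarity, this is roughly what a "curves + matrix" forward transform
does; a minimal Python sketch where the gamma values and the sRGB-like
matrix are placeholders, not my display's data:

import numpy as np

# "Curves + matrix" forward transform: device RGB -> linearized RGB via
# per-channel tone curves -> PCS XYZ via a single 3x3 matrix.
GAMMA = np.array([2.2, 2.2, 2.2])      # per-channel tone curves (placeholder)
RGB_TO_XYZ = np.array([                # sRGB-like primaries (placeholder)
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

def device_rgb_to_xyz(rgb):
    """rgb: float array in [0, 1]. Returns XYZ."""
    linear = np.power(np.clip(rgb, 0.0, 1.0), GAMMA)   # the "curves" part
    return RGB_TO_XYZ @ linear                          # the "matrix" part

print(device_rgb_to_xyz(np.array([1.0, 1.0, 1.0])))     # ~ D65 white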

/// End of the summary ///


After I fired up the FirePro, I tested the 10-bit pixel format support with
the test PS file from amd.com, and I confirmed that it works. The test
gradient looks much smoother now.

The next thing I tested was its effect on color management quality. The
display was in native gamut mode, and a calibration LUT was loaded into the
VGA LUT (D65, gamma 2.35 targets), while a simple "single gamma + matrix"
profile was assigned to the display.
I used an untagged 8-bit gray gradient image (so PS assumed the working
space as the source profile; I set it to sRGB) and started to play. (It's
easy to "turn 10-bit on and off", because PS falls back to 8-bit while I
drag its window around and returns to 10-bit when I release it...)

The expected result: the white balance should be kept while the gradation
should change.
The display has a low native gamma of ~1.8, the calibration target was 2.35,
and the CM target was sRGB (~2.2). So, I expected some banding (that's why I
chose this combination...).
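
Here is a rough Python model of this chain (the 1.8/2.35/2.2 numbers are
from my setup; the ideal LUT, the ideal panel and the lack of dithering are
simplifications), comparing the light-output error of an 8-bit and a 10-bit
framebuffer near black:

import numpy as np

# An 8-bit sRGB (~2.2) gray ramp is converted by the CMM into the display's
# 2.35 calibration space, quantized to the framebuffer depth (8 or 10 bits),
# then sent through the video-card LUT (2.35 -> native ~1.8) and the ~1.8
# panel response.
def emitted_light(fb_bits):
    x = np.arange(256) / 255.0                        # 8-bit source ramp
    k = 2.0**fb_bits - 1.0
    fb = np.round(x ** (2.2 / 2.35) * k) / k          # CMM output, quantized
    return (fb ** (2.35 / 1.8)) ** 1.8                # LUT + panel = fb**2.35

ideal = (np.arange(256) / 255.0) ** 2.2               # what the ramp should emit
for bits in (8, 10):
    err = emitted_light(bits) - ideal
    print(f"{bits}-bit framebuffer, worst light error in the darkest 32 codes:",
          f"{np.abs(err[:32]).max():.2e}")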

When I viewed the full gradient (0->255) image, 10-bit showed a significant
improvement. By my subjective observation, the (gamma-corrected) gradient
was much smoother with 10-bit output.
That was to be expected, but it wasn't obvious beforehand whether the change
would be clearly visible or barely noticeable...

But when I zoomed into the near-black range, the superiority of the 10-bit
output wasn't so obvious.
- In 8-bit mode I could see some banding here, but the white balance was OK:
the gradient kept its gray tones. (As I expected on a calibrated display,
where the calibration operates at 10-bit precision anyway.)
- When PS was in 10-bit mode, the banding was much weaker, BUT I noticed
some colorization.
I didn't notice this colorization until I zoomed into the problematic
segment of the gradient, but up close it looks quite heavy.
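
My guess at one possible mechanism, as a hypothesis in code form (the
per-channel gamma mismatch below is invented, not measured): if the three
channels go through slightly different correction curves and each is rounded
to the output grid independently, a neutral input can land on unequal R/G/B
codes, and near black a one-code mismatch is already a big relative error:

import numpy as np

# Slightly mismatched per-channel corrections, rounded independently, can
# turn a neutral input into unequal 10-bit codes near black.
per_channel_gamma = np.array([2.33, 2.35, 2.37])      # invented R/G/B mismatch
for gray in range(1, 6):                              # darkest few 8-bit inputs
    x = gray / 255.0
    codes = np.round(x ** (2.2 / per_channel_gamma) * 1023).astype(int)
    print(f"8-bit gray {gray} -> 10-bit RGB {codes}")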


So... What do you think about this colorization?

Did it come from the limitations of the 10-bit output?
I think it can be reduced if I get rid of this counter-manipulation (gamma
~1.8 -> 2.35 -> 2.2; but this was a synthetic test...).
And...

-> When I calibrated and profiled my display, the software used 8-bit values
only. The intermediate shades were never utilized.
So, my current theory is that my best profile would be a matrix + curves
profile, where the curves would contain measurements of the full 1024-step
primary ramps (+ the primaries at 100% and 75% luminance for the matrix, but
that's all... all: ~3080 test patches :D:D:D).
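
In Python, the patch set I have in mind would look something like this
sketch (the printed count comes out a bit lower than my ~3080, because the
set removes the duplicated black and 100% primaries):

# Full 1024-step ramp on each primary, plus the primaries at 100% and 75%
# luminance and the white point for the matrix.
patches = set()
for ch in range(3):                      # per-channel 10-bit ramps (the "curves")
    for v in range(1024):
        rgb = [0.0, 0.0, 0.0]
        rgb[ch] = v / 1023.0
        patches.add(tuple(rgb))
for level in (1.0, 0.75):                # primaries for the matrix
    for ch in range(3):
        rgb = [0.0, 0.0, 0.0]
        rgb[ch] = level
        patches.add(tuple(rgb))
patches.add((1.0, 1.0, 1.0))             # white
print(len(patches))                      # 3074 unique patches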

But I am not sure, because I have heard many times that XYZ LUT profiles are
much better (more accurate). But those have to be created from
multidimensional patches, so it would be overkill (for the measurement
process) to sample any of the "billion colors" beyond the "million colors".
And I know that ArgyllCMS creates 16-bit LUTs, but I don't know how many
points of the "curves" are stored, and at what precision.
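
For comparison, this is (roughly) what an XYZ LUT profile stores and how it
is used; a toy Python sketch where the grid is filled from a simple
gamma + matrix model instead of measurements, and with trilinear instead of
the tetrahedral interpolation real CMMs typically use:

import numpy as np

# A cLUT profile stores a 3D grid of XYZ values indexed by device RGB,
# interpolated at lookup time. Grid size, gamma and matrix are placeholders.
M = np.array([[0.4124, 0.3576, 0.1805],
              [0.2126, 0.7152, 0.0722],
              [0.0193, 0.1192, 0.9505]])

def toy_display(rgb):                      # stand-in for measured behaviour
    return M @ (np.asarray(rgb) ** 2.2)

N = 17                                     # 17x17x17 grid, a common cLUT size
axis = np.linspace(0.0, 1.0, N)
grid = np.array([[[toy_display([r, g, b]) for b in axis]
                  for g in axis] for r in axis])

def clut_lookup(rgb):
    """Trilinear interpolation of the 3D grid at device-RGB 'rgb'."""
    pos = np.clip(rgb, 0.0, 1.0) * (N - 1)
    i0 = np.floor(pos).astype(int)
    i1 = np.minimum(i0 + 1, N - 1)
    f = pos - i0
    out = np.zeros(3)
    for d in range(8):                     # blend the 8 surrounding grid points
        di, dj, dk = d & 1, (d >> 1) & 1, (d >> 2) & 1
        w = ((f[0] if di else 1 - f[0]) *
             (f[1] if dj else 1 - f[1]) *
             (f[2] if dk else 1 - f[2]))
        out += w * grid[i1[0] if di else i0[0],
                        i1[1] if dj else i0[1],
                        i1[2] if dk else i0[2]]
    return out

print(clut_lookup(np.array([0.5, 0.5, 0.5])))   # interpolated XYZ of mid gray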

So, what do you think? Which would be the better idea?
- Matrix + curves from the full [R, G, B] = [0, 1023] ramps + W, or
- an XYZ LUT from the usual, numerous multidimensional patches (300~2000
points)?


Yes, I know. The answer is very simple: I can't measure every 10-bit
gradient value, because dispread doesn't support 30-bit output (yet? Or does
it?).


Or do I expect too much of these color management engines (like PS ACE, MS
ICM, lcms2)? Can they really do an absolute colorimetric calculation the way
I imagine? Can they map any possible 24-bit color to its closest possible
corrected 30-bit color? Or do they work in a much simpler way?
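
What I imagine, in its simplest possible form, would be something like this
sketch (only the principle; lcms2 and the other engines have their own
internal precisions and pipelines that I don't know in detail):

import numpy as np

# Do the color correction in floating point, then round once, at the end,
# to the 10-bit output grid.
def correct_to_10bit(rgb8, transform):
    x = np.asarray(rgb8, dtype=float) / 255.0
    y = transform(x)                                   # float-precision correction
    return np.round(np.clip(y, 0.0, 1.0) * 1023).astype(int)

# Example: the gamma re-encode from the synthetic test above (2.2 -> 2.35)
print(correct_to_10bit([32, 32, 32], lambda x: x ** (2.2 / 2.35)))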
