ternaryd wrote:
Yes it should. A depth of 32 bits in X has been
allowed for a long time, and it's up to the
driver what to do with the spare bits.
Those lines in
"dispcal -v2" that say "Failed", don't they
mean it varies the colors, hoping to get a
reading that matches some target values within
some tolerance?
If it's hard to figure out, why not ask the
user?
Maybe you could even allow him to set
an environment variable, so that this
setting would apply to all Argyll tools. I
think if someone doesn't know he's using 30-bit
color, his bank account certainly will.
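To sketch what I mean by that environment variable (the variable name here is made up by me, not anything Argyll actually reads):

```python
import os

# Hypothetical variable name -- Argyll defines no such variable as far
# as I know; this only sketches the suggestion above. The user would
# export it once, and every tool would pick it up as a default.
os.environ.setdefault("ARGYLL_VIDEOLUT_BITS", "10")

# Each tool reads the override, falling back to 8 bits if unset:
bits = int(os.environ.get("ARGYLL_VIDEOLUT_BITS", "8"))
```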
BTW, dispcal told me that it was unable to
determine the bit-width of the Video LUT;
a bit later I
read that the official software for the Eizo
monitors (Windows only) doesn't allow
adjusting the Video LUT either. Might there be
some connection?
What would be the effect of this on the
results? Smoother but less precise curves?
I haven't yet figured out what madTPG is, but
if dithering helps, why wouldn't 10 bits?
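For context, here is my rough understanding of how dithering buys extra depth (my own illustration, not madTPG's or Argyll's actual method): an 8-bit output can approximate a 10-bit value by alternating between the two nearest 8-bit levels so that the time average lands on the fractional target.

```python
# Temporal dithering sketch: represent a 10-bit code on an 8-bit
# output by error diffusion across successive frames.

def dither_frames(value10, n_frames=64):
    """Emit n_frames 8-bit levels whose average approximates value10/4."""
    target = value10 / 4.0          # ideal 8-bit level, fractional
    out, err = [], 0.0
    for _ in range(n_frames):
        level = int(target + err)   # nearest level below, carrying error
        err += target - level       # remember what we still owe
        out.append(level)
    return out

frames = dither_frames(513)         # 10-bit 513 -> 8-bit 128.25
avg = sum(frames) / len(frames)     # averages out to 128.25
```

The eye (or a slow sensor) integrates the alternation away, which is presumably why dithering can stand in for genuinely wider LUT entries.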
My ColorMunki Photo is certainly not a high
quality spectrometer, but it should already be
better than my previous Spyder4. I'm
afraid I don't follow you here, though: if the
instrument is too imprecise to give reliable
(repeatable) measurements, how would dithering
help? Would it be possible (and useful?) to
run some test cycle to measure the tolerance of
the instrument?
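Something like this is what I had in mind for that test cycle (the readings below are hypothetical numbers, just to sketch the idea): measure the same patch many times and take the spread of the readings as an estimate of the instrument's tolerance.

```python
import statistics

def repeatability(readings):
    """Return (mean, sample standard deviation) of repeated readings
    of one and the same patch; the stdev estimates the tolerance."""
    return statistics.mean(readings), statistics.stdev(readings)

# Ten hypothetical luminance readings of a single grey patch:
samples = [120.4, 120.7, 120.3, 120.6, 120.5,
           120.4, 120.8, 120.2, 120.5, 120.6]
mean, spread = repeatability(samples)
```

If the spread turned out larger than the step size that dithering is supposed to resolve, that would seem to answer my question.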