[argyllcms] Re: White Point

  • From: Ben Goren <ben@xxxxxxxxxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Mon, 9 Nov 2015 15:33:02 -0700

On Nov 9, 2015, at 12:25 PM, Elle Stone <ellestone@xxxxxxxxxxxxxxxxxxxx> wrote:

My understanding is that "delta E" is measured against spectral calculations,
the nature of which calculations are outside my small area of "maybe I
understand something". But the calculations have to do with how light
combines to produce the colors we see and measure, which isn't exactly the
way colors multiply in the various RGB color spaces we use for image editing.

I think you've got it, or at least are on the right track.

The spectral calculations are actually much more straightforward and less scary
than they seem at first; most of the intimidation comes from symbology that the
typical curricula of non-STEM majors never introduce us to.

Basically...the spectral calculations are doing the exact same thing as the
familiar RGB stuff is doing, but with much narrower slices of the spectrum.

Start with the following ingredients:

* Illuminant spectral power distribution
* Sample spectral reflectivity
* Observer spectral function, per-channel

The first two you're undoubtedly familiar with. Very easy to plot: at
such-and-such a wavelength, the illuminant emits this-and-that amount of light;
at such-and-such a wavelength, the sample reflects this-and-that percentage of
the light that falls on it.

Getting the resulting reflected spectrum is trivial: for each wavelength,
multiply the incident light from the illuminant by the percentage of
reflectivity from the sample, and you've got a new spectral plot. A white
sample (say, Spectralon) has a spectral plot identical (within error) to that
of the illuminant. An ideal gray card has the same shape, but all the values
are reduced to 18% of the original. Something that's blue will typically
reflect most of the light in the ~450 nm range and little else; something
yellow will typically reflect almost nothing below, say, 550 nm and almost
everything beyond it.

But, remember: it's a combination of illuminant and sample. If the illuminant
has very little light below 550 nm and lots of light above it, when you shine
it at a piece of Spectralon, the resulting spectrum is indistinguishable from
what you'd get shining broad-spectrum light on a yellow paint chip.
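The per-wavelength multiplication above can be sketched in a few lines. The numbers here are made-up toy spectra (five coarse bins, an equal-energy illuminant, a yellowish sample), not real measured data:

```python
# Hypothetical coarse spectra, purely illustrative -- not measured data.
wavelengths = [450, 500, 550, 600, 650]        # nm, bin centers
illuminant  = [1.00, 1.00, 1.00, 1.00, 1.00]   # equal-energy illuminant SPD
reflectance = [0.05, 0.10, 0.60, 0.90, 0.95]   # yellowish sample: reflects long wavelengths

# Reflected spectrum: for each wavelength, incident power times reflectance.
reflected = [i * r for i, r in zip(illuminant, reflectance)]
```

With an equal-energy illuminant the reflected spectrum is just the reflectance curve scaled by 1.0; swap in an illuminant that's weak below 550 nm and you get the "can't tell Spectralon from a yellow chip" effect described above.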

The observer spectral function is very much like the sample spectral
reflectivity, only it's a matter of how much light makes it past the observer's
filters to the detectors as opposed to how much gets reflected. And, with
trichromats, you've got three such functions.

So, what you do is start with the spreadsheet here:


and do the exact same thing for each of the three channels in there that you
did once with the reflective sample. Now, you've got, per spectral slice, how
much of the light makes it through to each of the channels in the eye.

The last step is so simple it seems impossible. To get the _actual_ per-channel
signals...all you do is sum the per-channel spectral amounts. That's it! So, a
blue laser would have a large number for blue in one bin and small numbers for
green and red in the corresponding bin and zeros everywhere else; the final RGB
values are simply those three figures. A blue laser and a red laser shining on
the same sample would have that one bin plus, in another bin, a large figure
for red and a small one for green and almost zero for blue, so the RGB gets the
two reds plus the two greens plus the two blues.
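That last step -- weight each spectral slice by the channel's sensitivity, then sum the slices -- is a one-liner per channel. These observer curves are toy numbers, not the real CIE 1931 data:

```python
# Toy per-channel observer sensitivities over five bins -- NOT the CIE data,
# just shapes that peak roughly where the real curves do.
stimulus = [0.05, 0.10, 0.60, 0.90, 0.95]   # light arriving at the eye
obs_r    = [0.05, 0.05, 0.45, 1.00, 0.30]   # "red" channel, peaks ~600 nm
obs_g    = [0.05, 0.40, 1.00, 0.60, 0.10]   # "green" channel, peaks ~550 nm
obs_b    = [1.00, 0.30, 0.05, 0.00, 0.00]   # "blue" channel, peaks ~450 nm

def channel_signal(stimulus, sensitivity):
    # Per-slice weighting, then the sum that collapses the spectrum
    # to a single per-channel number.
    return sum(s * w for s, w in zip(stimulus, sensitivity))

r = channel_signal(stimulus, obs_r)
g = channel_signal(stimulus, obs_g)
b = channel_signal(stimulus, obs_b)
```

For this yellowish stimulus the red and green sums come out large and the blue sum small, which is exactly the "yellow" signal you'd expect.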

Now...I'm starting to come to the realization that nearly all of our color woes
have to do with the fact that we're using oversimplified models of human color
vision. The history dates back, famously, to 1931, when a computer was a
human, usually a woman, who sat at a table with a slide rule or a mechanical
tabulator like an old-time mechanical cash register. Computation, in other
words, was _very_ expensive and slow. And it's true that a "close enough" model
of human color vision can be achieved by pretending that it's simply three
linear / logarithmic monochromatic detectors. But it's also true that that's
only a rough approximation -- as is quickly evident by looking at the second
peak in the red function right underneath the blue function:


I'll bet a cup of coffee that what they're discovering in that paper at
colour-science.org is the degree to which different choices of primaries
diverge from the actual observer functions. And I'll bet another cup of coffee
that their method could be adapted to determining an optimal set of
primaries...and another cup (not on the same day!) that the optimal primaries
are going to be within shouting distance of the 450, 550, and 600 nm peaks of
the standard observer, with red and its funky double hump being the farthest
from the peak and blue being next farthest (to compensate for the red).

Now that we've got previously undreamt-of computational power at our ready
disposal, I think it's time to start considering moving beyond simple RGB
(etc.) and other simplified approximations of human vision, and going
straight for a full-out modeling.

The Standard Observer is messy, yes. And it very much doesn't lend itself well
to basic operations...you can do a simple linear (or gamma-encoded) calculation
to cut the apparent brightness of an RGB-encoded color in half by simply
dividing the numbers by two; the equivalent computation for the Standard
Observer is far hairier. But somebody just has to write those routines once and
then it's the computer's problem to figure out what to do to the numbers.
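To make the asymmetry concrete, here's the easy RGB case: halving linear values is a division, and gamma-encoded values just need a decode/scale/re-encode round trip (a simplified pure 2.2 power gamma is assumed here, not the exact sRGB transfer curve). The spectral equivalent would need the full observer machinery from above:

```python
# Halving apparent brightness of linear RGB: just scale each channel.
linear_rgb = [0.8, 0.4, 0.2]
half = [c / 2 for c in linear_rgb]   # [0.4, 0.2, 0.1]

# For gamma-encoded values (simplified pure 2.2 gamma, an assumption --
# real sRGB has a linear toe): decode to linear, scale, re-encode.
gamma = 2.2
encoded = [0.5, 0.5, 0.5]
half_encoded = [((c ** gamma) / 2) ** (1 / gamma) for c in encoded]
```

Note that halving a gamma-encoded 0.5 gives roughly 0.36, not 0.25 -- even this "simple" case already hides a nonlinearity.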

That doesn't solve the problem that input devices have observer functions
significantly different from human ones. That, however, should be amenable to a
bit of engineering -- of tuning the spectral efficiency of the color filters in
the Bayer (or whatever) array to more closely match those of the Standard
Observer. And, in the meantime, the existing profiling mechanism is well
suited to mapping a camera's RGB values to the Standard Observer in a
least-worst way...it's near-perfect for typical reflective surfaces, though
good luck photographing a spectrum.

...I should probably shut up, now....

