[argyllcms] Re: Profile input white not mapping to output white

  • From: Ben Goren <ben@xxxxxxxxxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Wed, 28 Nov 2012 18:34:05 -0700

On 2012-11-28, at 3:59 PM, Graeme Gill wrote:

>> That's what the RAW software already does, but Argyll is unaware that it's 
>> happened and
>> basically attempts to re-apply that same process. Furthermore, RAW software 
>> is designed
> 
> The processes are in the wrong order. Convert to color managed space,
> then select white point. Anything else is fairly crazy (how can you evaluate
> the image in an unknown device specific space ?). The RAW converters need
> some re-jigging to be color managed. Clipping colors above the white point
> in a chromatically consistent manner is something that should be done
> right at the point the white point is set, not some time afterwards.
> 
> [ie. it seems to me that you are trying to patch up some fundamental
> issues with the workflow, and this is not a good long term strategy.
> Better to fix the workflow, if possible.]

I have no arguments whatsoever with any of that -- quite the contrary.

Sadly...I lack the skills to fix things, and I lack the time to acquire the 
skills.

Fortunately, I've got a kludge that produces excellent results to tide me over 
until somebody *does* fix things.

>> to align the channels such that everything that's spectrally flat should 
>> when it's done
>> already have equal RGB values, at least for whatever's lit by the
>> (impossible-to-guess-after-RAW-development) illuminant the RAW software used 
>> for the
>> scene.
> 
> I'm not sure I follow this. Why is a spectrally flat sample of much
> importance ?

Um...because the people who first developed digital camera raw processing 
weren't color scientists, because they figured that a spectrally flat sample 
was a good way to determine the color temperature of the illuminant, and 
because no color scientists have come along since to fix things?

In practice, one typically sets the white balance in one of the following ways:

* letting the camera and / or computer worry about it;
* telling the software to use one of a small number of pre-defined presets, 
such as ``Tungsten,'' ``Cloudy,'' or ``Fluorescent'';
* telling the software to use a specific color temperature combined with a 
specific red / green balance (the latter on an arbitrary positive / negative 
numeric scale, with the user left to figure out what those numbers should be);
* or clicking on a sample in the picture that's supposed to lie on the neutral 
axis.

The latter was originally, and for quite some time, a true point sample. 
Recently, a couple of applications have expanded it to, for example, a 7x7 
pixel sample. Ilia's Raw Photo Processor is one of the few that let you define 
an arbitrary rectangle to use. It also lets you specify the actual channel 
multipliers (in EV equivalents) to use.
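A minimal sketch of what those EV-equivalent multipliers mean (my assumption about the convention, not RPP's actual code): one EV is one photographic stop, so the linear gain is two raised to the EV value.

```python
# Sketch: converting EV-style channel values to linear multipliers.
# Assumption: the software expresses each channel's gain in EV
# (stops), so +1 EV doubles the channel and -1 EV halves it.

def ev_to_gain(ev: float) -> float:
    """One EV (stop) corresponds to a factor of two in linear light."""
    return 2.0 ** ev

# Hypothetical example: +1 EV on red, 0 EV on green, -0.5 EV on blue.
gains = [ev_to_gain(v) for v in (1.0, 0.0, -0.5)]
print(gains)  # roughly [2.0, 1.0, 0.71]
```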

And you would be truly astounded at what people use for white balance targets. 
Personally, I generally use a piece of Tyvek, which is *almost* as good 
(spectrally) as PTFE, though it tends toward a bit of shininess and glare. When I'm 
doing general-purpose photography and I don't have a reliable neutral target to 
sample, I generally crank the saturation as high as it'll go, fiddle with the 
knobs until the image is as natural-looking as I can get it, and then return 
the saturation back to normal.

But it's common practice, and even common advice from people who should really 
know better, to use a piece of office copy paper, or a garment, or a lens 
cleaning cloth, or a waaaaaay overpriced accessory that's not even as 
spectrally flat as what you can get in a paint from the hardware store (if you 
know who to ask for the formula).

As I understand it, the raw processors then generally apply a simple linear 
transformation to the channel values such that your chosen sample has equal RGB 
values.

Does that step all over all sorts of color theory? I have no doubt. But it's 
what even the best raw processors do.
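As a rough sketch of that step (an assumption about what a typical raw converter does, not any particular program's code): the per-channel gains are just whatever scales the sampled patch to equal values, commonly normalized so green is left untouched.

```python
# Sketch of naive click-to-neutral white balancing: scale each
# channel so the sampled "neutral" patch comes out with equal RGB.
# Illustrative only; not any converter's actual code.

def white_balance_gains(sample_rgb):
    """Per-channel gains mapping the sample to equal RGB,
    normalized so green is left untouched."""
    r, g, b = sample_rgb
    return (g / r, 1.0, g / b)

def apply_gains(rgb, gains):
    """Apply the same linear gains to any pixel."""
    return tuple(c * k for c, k in zip(rgb, gains))

# A patch read as (0.8, 1.0, 1.2) -- too blue -- becomes neutral:
gains = white_balance_gains((0.8, 1.0, 1.2))
neutral = apply_gains((0.8, 1.0, 1.2), gains)  # approximately (1, 1, 1)
```

The same gains then get applied to every pixel in the image, which is exactly why sampling a slightly off-neutral target (copy paper, a garment) skews the whole picture.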

> I'm sure it helps your workflow. The problem is that it's just one workflow, 
> and I
> think it will severely compromise the usefulness of the profiles to cater for 
> just that
> workflow.

Oh, I completely agree -- this should *not* be the default behavior. As I hope 
I've made clear from the beginning:

> Now, maybe it could be added as an option,

Yes, please!

> but I'm very dubious about
> mapping 1,1,1 to D50 in the raw data - it would make more sense in your 
> workflow
> to map any colors over the media white point to D50 (assuming the media white
> point is the white point you want).

I'm not quite sure I follow you -- but that might be because you might be 
working off another misunderstanding of what raw processors do.

That is, just as they do the white balancing (in a crude manner), they also do 
all the clipping of values over the sensor's saturation point.

In the particular case of the scene I've been photographing, the lightest patch 
on the chart is a bit of Teflon thread tape, and it (as you would expect) 
measures at something around L=99 a=0 b=0. But there's plenty of stuff in the 
scene (not on the chart) that's brighter than that, especially including the 
background, but also including a fair amount of the sawdust. Raw processors 
also generally have ``highlight recovery'' features to bring back detail that 
is otherwise lost based upon the other development parameters. I'm not enough 
of a mathematician to explain what they do, but they often do a surprisingly 
good job of it. It's all done by fiddling with sliders 
until you get as much of the lost detail back in the image as you can or 
desire, whichever limit you hit first. And, obviously, this is again like 
making sausage, something that can't be undone.
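For what it's worth, here's a sketch of the distinction Graeme draws between naive clipping and clipping ``in a chromatically consistent manner'' (my interpretation, not Argyll's or any raw processor's actual code): capping each channel independently shifts the hue of over-range colors, while scaling all three channels by the largest one preserves their ratios.

```python
# Sketch: two ways of clipping over-range RGB (values normalized so
# 1.0 is the white point). Illustrative only.

def clip_per_channel(rgb):
    """Naive clip: each channel capped independently; hue can shift."""
    return tuple(min(c, 1.0) for c in rgb)

def clip_preserve_ratios(rgb):
    """Scale all channels by the largest, keeping their ratios."""
    m = max(rgb)
    return tuple(c / m for c in rgb) if m > 1.0 else rgb

over = (1.5, 1.0, 0.5)             # an over-range orange
naive = clip_per_channel(over)     # (1.0, 1.0, 0.5): shifted toward yellow
scaled = clip_preserve_ratios(over)  # (1.0, ~0.67, ~0.33): still orange
```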

Could a real color scientist accomplish the same (basic) end result but in a 
much superior way? I have no doubt. But, once again, we're stuck with this 
being the way that one gets images from a modern DSLR.

>> So...on the one hand, the ``right'' answer would be for Argyll to do its 
>> magic on the
>> RAW camera data, but that would mean you'd have to get into the business of 
>> Bayer
>> demosaicing and all the rest -- not something I think you're interested in. 
>> (But if you
>> are, I'll *really* get excited!)
> 
> It's on my list of things I'd like to look into (especially if it seems that 
> many of
> the RAW converters aren't properly color managed), but I can't see that 
> happening
> any time soon :-( :-(

In all honesty, you may well become the biggest hero of the photographic world 
if you were to develop a raw processing engine to the same standards as you've 
developed Argyll. But I can completely understand how an undertaking of that 
magnitude wouldn't be something at the top of your list right now.

>> That's why I still think the least-worst answer would be to add a switch 
>> to colprof
>> that says, ``the input is from a typical modern DSLR that's been through RAW 
>> processing
>> and thus is already white balanced and clipped, so assume that the endpoints 
>> of the
>> gray axis are already where they should be.''
> 
> I'll look into a "clip to white point" option. It may do what you want.

Thank you! I very much appreciate it, and would be more than happy to supply 
you with files to test or help in other ways. For example, all my recent work 
has been done with studio flashes that are already pretty close to D50. But 
what would things be like under other light sources? I'm going to do some 
experimentation along those lines, myself, but maybe not for a few more weeks 
(unless it's something that would help you).

And, may I suggest? Whatever needs to be done at the white point probably also 
needs to be done at the black point, for all the same reasons. The shortcomings 
there aren't anywhere near as obvious (for obvious reasons), but I'm very sure 
they're there as well.

Cheers,

b&