[argyllcms] Re: General calibration questions

  • From: Graeme Gill <graeme@xxxxxxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Mon, 17 Oct 2011 11:52:55 +1100

Andrés Vattuone wrote:
> As far as I can understand, the vcgt tag offers two alternatives: either
> storing gamma values for each channel (I don't know what the min and max
> gamma mean)

It's a formula with three parameters. I did find it somewhere in Apple's
mass of documentation - it's also in the icclib source code. Basically you
apply the power and then scale the result to the range min to max.
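
From memory the formula amounts to the following per channel - a sketch
with my own function name; check the icclib source for the real thing:

    #include <math.h>

    /* Apple vcgt formula type: gamma, min and max parameters per channel.
       'in' and the return value are normalized 0.0 .. 1.0. */
    double vcgt_formula(double in, double gamma, double min, double max)
    {
        return min + (max - min) * pow(in, gamma);
    }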

> or else storing a function as a table for each channel, possibly the same
> for all. My very limited knowledge of programming doesn't help me to grasp
> the bit depth of this table or how it is encoded.

It's right there in the documentation. ColorSync just uses the exact ICC
tags as data structures. So "UInt16" means unsigned 16 bit, etc.
You can easily look at the code in Argyll (icclib) or littleCMS too.
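
For reference, from memory the vcgt tag data is laid out roughly like
this - verify against the icclib source before relying on it:

    offset  type    content
    0       UInt32  'vcgt' type signature
    4       UInt32  reserved (zero)
    8       UInt32  0 = table, 1 = formula
    table variant:
    12      UInt16  channel count (1 or 3)
    14      UInt16  entries per channel
    16      UInt16  entry size in bytes (1 or 2)
    18      -       the entries themselves, e.g. UInt16 values 0 .. 65535
    formula variant:
    12      -       gamma, min, max fixed point numbers for each channel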

> What exactly does Argyll do as regards calibration? I presume it reads
> either gamma values or else 1D LUT curves and saves them in some type of
> memory. (The card's memory?)

I'm not sure what you mean. Argyll has tools both to create a calibration
from instrument measurements (dispcal) and to apply the calibration to the
system using the video card per channel lookup tables (dispwin).
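
ie. roughly (see the documentation for the full set of options):

    dispcal -v mycal        # measure the display and create mycal.cal
    dispwin mycal.cal       # load the calibration into the video card LUTs
    dispwin -c              # clear the LUTs back to linear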

> I tried another loader and it gave the same results. I tested this with
> Argyll, reading a series of test patches.

> For some unknown reason I wasn't able to load the vcgt tag directly from
> profiles using Argyll yet. I have to check the documentation again. So I
> implemented the following workaround in Windows: I installed the profile as
> default, extracted with Argyll the calibration information to a calibration
> file and finally loaded the calibration settings with Argyll.

dispwin will directly load the vcgt tag into the graphics card Video LUTs.
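
ie. either of:

    dispwin profile.icm     # load the vcgt from the given profile
    dispwin -L              # load from the currently installed display profile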

> I am trying to prevent any unwanted calibration changes either by the
> monitor, the video card or the OS. Or at least be informed when such
> changes occur. 

That's quite difficult, and may be impossible in general. Typically there is no
good API for things like accessibility controls in the various Operating
Systems, and various hacks would be needed. To access the display controls you
would need to implement DDC communications. There is very low level support
for this on MSWin and OS X, and no direct support for it on Linux. There was a
Linux project for this (ddccontrol on SourceForge), but they struggled to
accommodate the different ways each display behaved. The main flaw from my
perspective is that there is no connection between the DDC channel and the
graphics card involved, so some (probably clumsy manual) means of establishing
this logical link would be needed. With DDC communications you may or may
not be able to check or set the display settings - I suspect that
it will depend on the particular display. Note that a few high end
monitors use USB instead of DDC for this, although they use the same DDC
protocol over USB.
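
To give a flavour of the low level involved, a DDC/CI "Get VCP Feature"
request looks something like this - a sketch from memory of the MCCS/DDC-CI
spec, where i2c_write() is a hypothetical stand-in for however the platform
exposes raw I2C writes:

    /* hypothetical raw I2C write helper, platform dependent: */
    int i2c_write(int fd, int addr, const unsigned char *buf, int len);

    void ddc_get_vcp_request(int fd, unsigned char vcp_code)
    {
        unsigned char msg[5];
        msg[0] = 0x51;            /* source address (the host) */
        msg[1] = 0x80 | 2;        /* payload length, with top bit set */
        msg[2] = 0x01;            /* "Get VCP Feature" opcode */
        msg[3] = vcp_code;        /* e.g. 0x10 = luminance */
        msg[4] = 0x6E ^ msg[0] ^ msg[1] ^ msg[2] ^ msg[3];
                                  /* XOR checksum, including the display's
                                     write address 0x6E */
        i2c_write(fd, 0x37, msg, sizeof msg);  /* displays listen at 0x37 */
    }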

> So I am trying to learn all possible ways in which calibration can be
> altered. Video card LUTs are stored in the card's memory?

They are stored in the card somewhere, often in the hardware that does the
lookup of the video data. Note that many manufacturers have their own
private APIs that allow additional changes - for instance my NVidia card
on MSWin has its own private "Color Correction" settings where you can
set lookup curves, "Digital Vibrance", "Brightness", "Contrast", and it even
has its own ICC vcgt loader too. I'm not sure how these interact with
the MSWin VideoLUT functions - they may well be in series. I doubt that
these NVidia settings have a documented API - you'd probably have to reverse
engineer it. I'm sure AMD have the equivalent for their cards too.
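
(For comparison, the documented MSWin VideoLUT access is GDI's
GetDeviceGammaRamp/SetDeviceGammaRamp. A minimal sketch of setting a ramp:)

    #include <windows.h>
    #include <math.h>

    int main(void)
    {
        WORD ramp[3][256];          /* red, green and blue lookup curves */
        for (int i = 0; i < 256; i++) {
            /* an arbitrary example curve, same value for all channels */
            WORD v = (WORD)(65535.0 * pow(i / 255.0, 1.0 / 1.1) + 0.5);
            ramp[0][i] = ramp[1][i] = ramp[2][i] = v;
        }
        HDC hdc = GetDC(NULL);           /* primary display */
        SetDeviceGammaRamp(hdc, ramp);   /* returns FALSE on failure */
        ReleaseDC(NULL, hdc);
        return 0;
    }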

> The settings that the drivers presumably implement by manipulating their
> LUTs are stored in the LUTs themselves? Or perhaps elsewhere, some place the
> loaders can't access? What if they had 3D tables? Are the 1D LUTs stored in
> the vcgt tag loaded there?

Sorry, I'm not sure quite what you mean. These things are typically layered
though:

- The bottom layer is what the application does, which may include using
  ICC profiles (ie. Photoshop).
- The next layer is the O.S. supported APIs like VideoLUT loading, which is
  what Argyll dispwin uses. The O.S. video driver implements this using the
  graphics card facilities.
- The next layer is the graphics card private processing, such as the NVidia
  Color Correction settings.
- The next layer is the display's private calibration curves/matrices/3D
  tables (if any, ie. EIZO, NEC, HP DreamColor etc.).
- The last layer is typically the display controls. (I could imagine the
  last two interacting on some displays, if it's badly implemented.)

> If you have DDC compatible hardware are the physical controls on the monitor
> still independent from the video card?

Of course. How would they be dependent? The display DDC controls are so
varied that no graphics driver would dare to assume anything about how they
work, so the graphics card just passes DDC through to the OS at a very low
level - typically the bit twiddling level.

Graeme Gill.
