I'm trying to find out how well an RGB camera performs as a colorimeter for
prints. So I took images of test charts (IT8.7/4) under constant lighting
conditions.
I measured every color patch of the chart with a spectrophotometer and got the
reference L*a*b* values.
Now I used the scanin command with the image ("image1"), the .cht file and the
.cie file containing the measured L*a*b* values.
I created a cLUT-based profile ("profile1") with the following options: colprof
-v -qh -bn -al -u profile1
Then I checked how well the created profile performs with profcheck ("profcheck
-v2 -s image1.ti3 profile1.icm")
and got errors (CIE94) of up to 10 (average error 1.6). I found that the
biggest errors sit in the region of low L* (< 25) and between -30 and +30
in a* and b*.
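To make sure I understand the numbers profcheck reports, I wrote my own toy implementation of the CIE94 formula (graphic-arts weights kL = 1, K1 = 0.045, K2 = 0.015; this is just my sketch, not ArgyllCMS code). One thing I notice from it: for near-neutral patches the chroma weights SC and SH stay close to 1, so errors in the low-chroma region are reported at nearly full magnitude.

```python
import math

def cie94(lab1, lab2, kL=1.0, K1=0.045, K2=0.015):
    """CIE94 colour difference (graphic-arts weights), lab1 as reference."""
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2
    dL = L1 - L2
    C1 = math.hypot(a1, b1)
    C2 = math.hypot(a2, b2)
    dC = C1 - C2
    da, db = a1 - a2, b1 - b2
    # dH^2 can go slightly negative through rounding; clamp at 0.
    dH2 = max(da * da + db * db - dC * dC, 0.0)
    SL, SC, SH = 1.0, 1.0 + K1 * C1, 1.0 + K2 * C1
    return math.sqrt((dL / (kL * SL)) ** 2
                     + (dC / SC) ** 2
                     + dH2 / SH ** 2)

# Same a*/b* shift, once near neutral and once at high chroma:
print(cie94((50, 1, 1), (50, 4, 1)))    # near-neutral: weighted heavily
print(cie94((50, 60, 1), (50, 63, 1)))  # chromatic: weighted down
```

So if my toy version is right, a given colorimetric error near the neutral axis yields a larger ΔE94 than the same error at high chroma, which would partly explain why the worst numbers cluster there.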
Now my naive first expectation was that the error would be near 0, because I
used a look-up-table based profile and calculated the error on the same
.ti3 file used for the profile generation.
Probably I don't get the concept of the look-up tables. I thought that if I have
604 color patches and I know the corresponding L*a*b* values, there should be no
problem mapping them from RGB to L*a*b*.
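My current understanding (please correct me if this is wrong) is that colprof does not memorize the 604 patches, but fits a smooth regular grid to the scattered measurements, so some residual error at the training patches is expected by design. Here is a toy 1-D analogue I wrote to convince myself (made-up data and a made-up gamma curve, not what colprof actually does internally):

```python
import numpy as np

rng = np.random.default_rng(0)

# Scattered "measurements": a smooth curve plus measurement noise,
# standing in for the RGB -> L* relationship of the chart patches.
x = rng.uniform(0.0, 1.0, 200)
y = x ** (1 / 2.4) + rng.normal(0.0, 0.01, x.size)

# Fit a coarse 1-D LUT (9 nodes, linear interpolation between them)
# by least squares -- a toy stand-in for fitting a 3-D cLUT grid.
nodes = np.linspace(0.0, 1.0, 9)

def basis(xs):
    """Hat-function weight of each sample w.r.t. each LUT node."""
    B = np.zeros((xs.size, nodes.size))
    idx = np.clip(np.searchsorted(nodes, xs) - 1, 0, nodes.size - 2)
    t = (xs - nodes[idx]) / (nodes[1] - nodes[0])
    B[np.arange(xs.size), idx] = 1.0 - t
    B[np.arange(xs.size), idx + 1] = t
    return B

coeffs, *_ = np.linalg.lstsq(basis(x), y, rcond=None)

# Residuals at the *training* points are not zero: the LUT smooths
# over noise and over curvature between its nodes.
residuals = basis(x) @ coeffs - y
print(f"max residual at training points: {np.abs(residuals).max():.4f}")
```

If that intuition carries over to 3-D, the patches never land exactly on the grid nodes, and the fit deliberately smooths over measurement noise, so per-patch residuals like mine would be normal rather than a usage error.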
So why does the transformation perform so badly in the mentioned areas?
I thought maybe it's because of a bad signal-to-noise ratio when imaging dark
patches, and in addition maybe the lighting plays a certain role? I used a
really bright white-phosphor LED, so the spectrum drops at 480 nm.
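To check the SNR idea, I looked at how L* itself reacts to a fixed small luminance error. Because of the cube root in the L* formula, the same ΔY produces a much larger ΔL* in the shadows than in the mid-tones, so any sensor noise on the dark patches gets amplified in Lab space. A quick sketch (assuming relative luminance with white point Y = 1):

```python
def Lstar(Y, Yn=1.0):
    """CIE 1976 L* from relative luminance Y (white point Yn)."""
    t = Y / Yn
    d = 6.0 / 29.0
    ft = t ** (1.0 / 3.0) if t > d ** 3 else t / (3 * d * d) + 4.0 / 29.0
    return 116.0 * ft - 16.0

# The same fixed luminance error dY, applied at different gray levels:
dY = 0.002
for Y in (0.02, 0.05, 0.20, 0.50):
    dL = Lstar(Y + dY) - Lstar(Y)
    print(f"Y = {Y:.2f}:  dL* = {dL:.2f}")
```

In my sketch the dark end shows several times the ΔL* of the mid-tones for the same ΔY, which fits the pattern of large errors at L* < 25.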
I'm a student and quite new and inexperienced in this field. I just want to
make sure I didn't do something wrong with the usage of ArgyllCMS. I'm
happy for any input I can get.
Thank you in advance!