John Weissberg wrote:
I have noticed a very large difference between the whitepoint actually measured (the 0 0 0 0 patch in the .ti3 file) and the whitepoint stored in the 'wtpt' tag of a profile generated by Argyll.

    Measured whitepoint (from .ti3 file): 97.510    0.78000  -2.9600
    Whitepoint from 'wtpt' tag:           89.079722 0.552168 -2.911983

The difference seems huge to me. Can someone explain how it may have come about?
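For reference, the gap between those two whitepoints works out to a CIE76 delta E*ab of about 8.4, which quantifies the "huge" difference. A minimal check in Python, using only the Lab values quoted above (the function name is illustrative, not anything from Argyll):

    import math

    def delta_e76(lab1, lab2):
        """Plain CIE76 colour difference: Euclidean distance in L*a*b*."""
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

    measured = (97.510, 0.780, -2.960)           # 0/0/0/0 patch from the .ti3
    wtpt_tag = (89.079722, 0.552168, -2.911983)  # 'wtpt' tag in the profile

    print(f"dE76 = {delta_e76(measured, wtpt_tag):.2f}")  # ~8.43

Nearly all of that distance is in L* (97.51 vs 89.08), i.e. the profile's white tag is much darker than the measured paper white.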
I think the data set is basically inconsistent. For instance, take a look at these three values:

       1: 0.0000 0.0000 0.0000 0.0000 -> 97.510000  0.780000 -2.960000
    3106: 1.0000 1.0000 0.0000 0.0000 -> 87.580000  0.970000 -4.000000
    3097: 2.1800 2.2000 5.2000 0.2000 -> 93.070000 -0.510000  3.910000

So this is saying that adding an additional 1% Cyan, 1% Magenta, 5% Yellow and 0.2% Black to 1% Cyan and 1% Magenta makes the result about 6 delta E lighter! The data set is full of similar contradictions, hence the poor profile fit, no matter what resolution table is used. There's either an issue with device or instrument consistency, or the device's behaviour is highly non-monotonic.

Graeme Gill.
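Graeme's spot check generalises: you can scan the whole measurement set for patch pairs where every ink channel increases yet L* also rises. A minimal sketch, assuming the .ti3 rows have already been parsed into (id, CMYK, Lab) tuples; the function name and the 2.0 L* threshold are illustrative assumptions, not part of any Argyll tool:

    def find_lightness_contradictions(rows, min_l_gain=2.0):
        """Flag patch pairs where patch B uses at least as much of every
        ink as patch A (and more of at least one), yet prints lighter."""
        hits = []
        for id_a, cmyk_a, lab_a in rows:
            for id_b, cmyk_b, lab_b in rows:
                more_ink = (cmyk_b != cmyk_a and
                            all(b >= a for a, b in zip(cmyk_a, cmyk_b)))
                if more_ink and lab_b[0] > lab_a[0] + min_l_gain:
                    hits.append((id_a, id_b, lab_b[0] - lab_a[0]))
        return hits

    # The three patches quoted above:
    rows = [
        (1,    (0.00, 0.00, 0.00, 0.00), (97.51,  0.78, -2.96)),
        (3106, (1.00, 1.00, 0.00, 0.00), (87.58,  0.97, -4.00)),
        (3097, (2.18, 2.20, 5.20, 0.20), (93.07, -0.51,  3.91)),
    ]
    for id_a, id_b, gain in find_lightness_contradictions(rows):
        print(f"patch {id_b} uses more of every ink than patch {id_a} "
              f"but is {gain:.2f} L* lighter")
    # -> patch 3097 uses more of every ink than patch 3106 but is 5.49 L* lighter

Running a check like this over the full 3000-odd patches would show how widespread the contradictions are, and whether they cluster (suggesting instrument drift or misreads) or are scattered (suggesting genuinely non-monotonic device behaviour).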