Lars Tore Gustavsen wrote:
I found an old Seybold Report titled "Measuring the quality of ICC profiles" http://www.heidelberg.com/wwwbinaries/bin/files/dotcom/en/prinect/prinect_profile_toolbox.pdf . For monitor profiles they have three objective tests of quality. They are (page 13): achieved gamma, the DE difference of the whitepoint from the D50 target, and the average DE for a ColorChecker measured on the monitor.
Gamma is not as easy to verify as it may seem, due to the multitude of ways that the non-zero black point can be accounted for.
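To illustrate, here is a minimal Python sketch (the display numbers are hypothetical, not from the attached log) showing how two common conventions for handling the black point report different "measured" gammas for the very same display response:

```python
import math

def gamma_simple(v, Y, Yw):
    # Naive convention: assume black is zero, so gamma = log(Y/Yw) / log(v)
    return math.log(Y / Yw) / math.log(v)

def gamma_offset(v, Y, Yb, Yw):
    # Offset convention: model Y = Yb + (Yw - Yb) * v**gamma and solve for gamma
    return math.log((Y - Yb) / (Yw - Yb)) / math.log(v)

# Hypothetical display: white 100 cd/m^2, black 0.5 cd/m^2, "true" offset gamma 2.2
Yw, Yb, g = 100.0, 0.5, 2.2
v = 0.5                         # 50% input stimulus
Y = Yb + (Yw - Yb) * v ** g     # luminance the instrument would read

print(gamma_simple(v, Y, Yw))       # does not return 2.2 (black leaks in)
print(gamma_offset(v, Y, Yb, Yw))   # recovers 2.2
```

So two tools can both honestly report "gamma" for the same display and disagree, simply because they account for the non-zero black differently.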
When I afterwards did a "profile -v -as Colorcheker" I got an avg DE of 5.6. I get some really big mismatches on a few of the patches (see profcheck output); there is a DE of 60 on patch 16. I guess this yellow patch is out of gamut. I find it hard to understand why they have included out-of-gamut measurements.
The ColorChecker is essentially a print-based chart, and for subtractive processes yellow is a primary, so it is naturally often more saturated than the secondary yellow created by an additive device like a monitor.
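A rough way to check whether a given Lab value falls inside a display gamut is to convert Lab -> XYZ -> linear RGB for the display's colorspace and see whether any channel clips. The sketch below uses sRGB as a stand-in for the monitor and D65-referenced Lab (the chart's published values are D50-referenced, so this is only illustrative, not the check Argyll itself performs):

```python
def lab_to_xyz(L, a, b, white=(0.95047, 1.0, 1.08883)):  # D65 reference white
    fy = (L + 16) / 116
    fx = fy + a / 500
    fz = fy - b / 200
    def f_inv(t):
        # inverse of the CIELAB f() function, with the linear toe
        return t ** 3 if t > 6 / 29 else 3 * (6 / 29) ** 2 * (t - 4 / 29)
    return tuple(w * f_inv(t) for w, t in zip(white, (fx, fy, fz)))

def in_srgb_gamut(L, a, b, eps=1e-6):
    X, Y, Z = lab_to_xyz(L, a, b)
    # XYZ -> linear sRGB (IEC 61966-2-1 matrix); out-of-gamut clips below 0 or above 1
    r = 3.2406 * X - 1.5372 * Y - 0.4986 * Z
    g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
    bl = 0.0557 * X - 0.2040 * Y + 1.0570 * Z
    return all(-eps <= c <= 1 + eps for c in (r, g, bl))

print(in_srgb_gamut(50, 0, 0))    # True: mid gray is always in gamut
print(in_srgb_gamut(50, 150, 0))  # False: hyper-saturated red clips
```

Any patch that fails such a test can only be reproduced with error on the display, which inflates the average DE without saying anything about profile quality.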
I also get a quite big error on patch 20 (it's L*81), with a DE of about 18. I don't understand why, since I got pretty nice numbers when I did the calibration with dispcal. I have attached a log of today's calibration/profiling.
There are lots of traps to fall into with such an exercise. You are essentially doing an absolute comparison, so the rendering needs to be set up to be absolute as well. Often the white points mismatch (reflective standard D50, emissive D65), and there is even poor agreement on how ICC emissive profiles should record the absolute behaviour of a display. So you'd need to stick to an Argyll based CMM to eliminate the latter as a possible source of error.

Graeme Gill.
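To put a number on the white point trap: a perfect D65 display white, judged absolutely against a D50 reference white, comes out around 19 DE*ab by itself, which is in the same ballpark as the error reported on the light patch above. A small sketch using the standard CIELAB formulas (the white point XYZ values are the usual published ones):

```python
def xyz_to_lab(XYZ, white):
    # standard CIELAB forward transform, relative to the given reference white
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    X, Y, Z = (c / w for c, w in zip(XYZ, white))
    fx, fy, fz = f(X), f(Y), f(Z)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

D50 = (0.9642, 1.0, 0.8249)
D65 = (0.9505, 1.0, 1.0890)

# A perfect D65 display white, evaluated against the D50 reference white
L, a, b = xyz_to_lab(D65, D50)
dE = ((L - 100) ** 2 + a ** 2 + b ** 2) ** 0.5
print(round(dE, 1))  # roughly 19.5
```

Unless the comparison chromatically adapts one white to the other (or the rendering is truly absolute on both sides), a discrepancy of this size is baked in before the profile itself contributes any error.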