Recently I played with dispcalGUI to calibrate and profile a standard iMac 22" display for prepress with my Eye-One Pro. Starting from the corresponding preset and with some slight changes, I have been able to obtain what appear to be very good results (see attached verify results).
Just keep in mind that such a profile verification almost always shows good results (if nothing went wrong during profiling), especially if the verification is run directly after the display was calibrated and profiled. dispcalGUI's default 'verify.ti1' (and also the 'extended' version) has a relatively low patch count and consists of colors that are mostly part of the default profiling testcharts, which explains the low deviations. You can of course use any other chart to stress the verification with colors not used during profiling, and there is also the possibility to use XYZ or Lab reference files, like the Fogra media wedge subsets available from the Fogra website, or one of the various ColorChecker reference files floating around.
> to better understand the differences between the dispcal verification method and the one summarized in the screenshot.
First let me say I don't know the internals of UDACT, but the way it tests colors and gray balance is explained in the user's guide, so you may already know what I'm writing here. As I understand it, the gray balance 'range' check uses a combined delta a/delta b absolute deviation (e.g. max delta a = -0.5 and max delta b = 0.7 for gray balance should result in a range of 1.2). Because measurements in the extreme darks can be problematic, it only takes into account gray patches with a minimum luminance of 1% of white, as noted in the report (i.e. if the white luminance is 120 cd/m2, then only patches with at least 1.2 cd/m2 will be taken into account). I would assume that UDACT looks up the corresponding Lab target values for fixed R=G=B numbers through the profile (this is what dispcalGUI does when verifying via a TI1 file), but as Graeme said, since the actual target values are not shown, this is only a guess. dispcalGUI currently uses a simpler approach for gray balance results (just average and maximum delta C). This will change in the next release, where I've included an a/b range check somewhat similar to UDACT's, and I've also added the number of tone values to the curve viewer (I'm not sure how this is tested in UDACT though, so comments from others are welcome as always).
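To make the 'range' arithmetic above concrete, here is a minimal Python sketch of my reading of that check. This is illustrative only: UDACT's actual internals are not public, and the function and variable names are mine.

```python
# Sketch of the gray-balance 'range' check as I understand it:
# combined absolute a/b deviation over gray patches at or above
# 1% of the white luminance. NOT actual UDACT or dispcalGUI code.

def gray_balance_range(patches, white_cd_m2, min_luminance_pct=1.0):
    """patches: list of (luminance_cd_m2, delta_a, delta_b) per gray patch.
    Patches darker than min_luminance_pct of white are ignored, since
    measurements in the extreme darks can be problematic."""
    threshold = white_cd_m2 * min_luminance_pct / 100.0
    usable = [(da, db) for lum, da, db in patches if lum >= threshold]
    if not usable:
        return 0.0
    max_da = max(abs(da) for da, _ in usable)  # largest |delta a|
    max_db = max(abs(db) for _, db in usable)  # largest |delta b|
    return max_da + max_db

# Example from the text: max delta a = -0.5, max delta b = 0.7 -> range 1.2.
# The darkest patch (0.6 cd/m2 < 1% of 120 cd/m2) is excluded.
patches = [(120.0, -0.1, 0.2), (60.0, -0.5, 0.3),
           (12.0, 0.2, 0.7), (0.6, 3.0, -2.0)]
print(gray_balance_range(patches, white_cd_m2=120.0))  # -> 1.2
```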
> Is it really impossible to achieve better results in terms of "tone values" working only with the VCGT transfer curves?
To increase the number of tone values remaining after calibration, in addition to Graeme's suggestions, another thing that may help is altering the whitepoint target. The iMac display afaik has no whitepoint adjustment, so if at all possible you probably want to stay close to the native whitepoint, to minimize the tone values lost to the calibration curves trying to achieve a different whitepoint. This might not be optimal if the native whitepoint is far from your lighting conditions, though. Choosing a gamma that is closer to the native response of the display could also help. You can approach that by using 'verify uncalibrated display' from the 'tools' menu (which equals dispcal -R) to get an approximate gamma reading as a starting point. You can then look at the calibration curves: the closer they are to the idealized diagonal line, the less the video card gamma table needed to be altered, and the fewer tone values should be lost. Of course, a certain number of values will always be lost, even when using the native whitepoint, in order to achieve the target gamma and gray balance.
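As a rough illustration of why curves far from the diagonal cost tone values, here is a toy Python sketch. It assumes a simple power-law display response and an 8-bit, 256-entry gamma table; real VCGT data and display behavior are more involved, so take the numbers as qualitative only.

```python
# Toy model (not dispcalGUI code): quantizing a gamma-correction LUT to
# 8 bits maps some distinct inputs to the same output value, so tone
# values are lost. The closer target gamma is to the native response,
# the closer the LUT is to the identity diagonal and the less is lost.

def remaining_tone_values(native_gamma, target_gamma, levels=256):
    """Count distinct 8-bit outputs of a LUT that bends a display with
    `native_gamma` towards an effective `target_gamma`."""
    lut = []
    for i in range(levels):
        v = (i / (levels - 1)) ** (target_gamma / native_gamma)
        lut.append(round(v * (levels - 1)))  # quantize to 8 bits
    return len(set(lut))

print(remaining_tone_values(2.2, 2.2))  # identity curve: all 256 remain
print(remaining_tone_values(2.4, 2.2))  # close to native: few lost
print(remaining_tone_values(2.4, 1.8))  # far from native: more lost
```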
(btw, I'd be interested in the 'profile quality' part of the UDACT report for comparison with the dispcalGUI verification results, if you're willing to share :) )

Regards
--
Florian Höch
http://hoech.net