>> Anyway, correcting in two places in theory degrades more, no matter
>> how many bits the LUTs have...
>
> And in general terms, I would agree with your assessment. But not
> necessarily this time: if you calibrate based solely on software
> correction, especially if the deviation from the target is big, your
> results WILL definitely be worse. That is because if you correct in
> hardware first, 10 bits give you 1024 steps, compared to the 256 steps
> available at 8 bits. So you have more room to operate in without
> touching the final 8 bits your graphics card will send to your display.
>
> I think this is definitely preferable.

Wouldn't applying a LUT shrink the gamut of the display? In that case,
first you use the monitor calibration, which shrinks the gamut, and then
you use that shrunk gamut to create a profile/LUT that shrinks it
further. Excuse me if I'm not making sense; I'm fairly new to colour
management.

/ KJ
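
The bit-depth point above can be illustrated with a toy sketch (my own
illustration, not from any calibration tool: the function name and the
gamma-style curve are hypothetical stand-ins for a real correction
curve). Counting how many distinct output levels survive after
quantizing the same curve to 8 bits versus 10 bits shows why a
higher-precision hardware LUT leaves more room to operate:

```python
def apply_curve(in_levels, gamma, out_steps):
    # Apply a gamma-style correction curve to `in_levels` equally spaced
    # inputs, quantize the result to `out_steps` output levels, and count
    # how many distinct output levels remain. Collisions (two inputs
    # landing on the same output) show up as banding on the display.
    out = set()
    for i in range(in_levels):
        x = i / (in_levels - 1)
        y = x ** gamma
        out.add(round(y * (out_steps - 1)))
    return len(out)

# Correcting 256 input levels inside an 8-bit (256-step) LUT collapses
# some of them onto the same output value:
levels_8bit = apply_curve(256, 2.2, 256)

# The same correction in a 10-bit (1024-step) LUT preserves more
# distinct levels, so the final 8-bit signal loses less:
levels_10bit = apply_curve(256, 2.2, 1024)

print(levels_8bit, levels_10bit)
```

The exact counts depend on the curve, but the 10-bit version always
retains at least as many distinct levels as the 8-bit one, which is the
"more room to operate in" the post describes.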