I stumbled on an article on DICOM, here:
http://umich.edu/~ners580/ners-bioe_481/lectures/pdfs/2008-06-JCI_Fetterly-grayscaleLCD.pdf
It's quite impressive, technically, especially the part about "allocating the
available JNDs" to the 256 8-bit "drive levels". The way I understand it, the
idea is to make each step between adjacent drive levels span a "constant"
number of JNDs.
I'm not about to embark on DICOM-calibrating my monitor, but I thought it
"confirmed" some of my own intuition on the subject. So, instead of using a
JND as the unit, why not use a unit of L*? If my monitor is set for, say, 100
cd/m2 at maximum, and assuming a perfect 0 at RGB = 0,0,0 (to simplify the
discussion), then the "task" of calibration becomes one of dividing the
available Lightness range (100 L* units) over the 256 available 8-bit
"calibration slots", such that each step is 100/256 = 0.39 L* units?
(Strictly there are only 255 increments between 256 levels, so each step
would be 100/255, about 0.392, but close enough.) I'm not trying to make
things more complex than they need to be - believe me. 0.39 is an odd value
(I still have to wrap my head around the implications) but
what if the monitor's Max Luminance were set to 256 cd/m2? Then the task of
calibration, using that approach, would simply be one of making sure that
there is exactly *one* Delta L increment between each point of the 8-bit
video LUT/ramp in the video card?
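
Just to see what that equal-L* idea does to the target luminances, here is a
minimal sketch, assuming a 100 cd/m2 white, a perfect zero black, the display
white as the L* reference, and the standard CIE 1976 lightness formula (the
function name is mine):

import numpy as np

def lstar_to_luminance(lstar, white_cdm2=100.0):
    # Invert the CIE 1976 lightness formula: L* -> luminance in cd/m^2,
    # treating the display's white luminance as the reference white.
    lstar = np.asarray(lstar, dtype=float)
    y = np.where(lstar > 8.0,
                 ((lstar + 16.0) / 116.0) ** 3,  # cube-law branch
                 lstar / 903.3)                  # near-black linear branch
    return y * white_cdm2

# 256 levels with equal L* spacing: 255 increments of 100/255 ~ 0.392 L*.
lum_targets = lstar_to_luminance(np.linspace(0.0, 100.0, 256))
print(lum_targets[1], lum_targets[128], lum_targets[-1])

The bottom end is the interesting part: the first step above black lands at
roughly 0.392/903.3 * 100, about 0.04 cd/m2, which is why the "perfect zero"
assumption matters in practice.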
I hate to think out loud like that.
/ Roger