Knut Inge wrote:
I have this nice camera that measures photons quite linearly over a large dynamic range (3000:1 or so) presented in a raw file. Exposure can even be shifted to cover different ranges with better SNR, and spatial samples can be averaged to improve SNR with regards to random noise. Can I use it to estimate (at least) the luminance-gamma at the bottom of the scale and/or estimate the error of my calibrator in this respect?
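[As an aside on the averaging claim above: averaging N frames of independent, zero-mean random noise reduces the noise standard deviation by a factor of sqrt(N), so SNR improves by sqrt(N). A minimal numpy sketch with made-up numbers (the signal level, noise level, and frame count are all assumptions for illustration):]

```python
import numpy as np

# Hypothetical sensor model: constant signal plus zero-mean Gaussian noise.
rng = np.random.default_rng(0)

signal = 100.0    # assumed constant photon signal level
sigma = 10.0      # assumed per-pixel random noise (std dev)
n_frames = 16     # number of frames to average

# Simulate n_frames exposures of 100,000 pixels each.
frames = signal + rng.normal(0.0, sigma, size=(n_frames, 100_000))

# SNR of a single frame vs. SNR after averaging all frames per pixel.
snr_single = signal / frames[0].std()
snr_avg = signal / frames.mean(axis=0).std()

print(snr_single)  # ~10
print(snr_avg)     # ~40, i.e. a sqrt(16) = 4x improvement
```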
In principle you can, though it sounds like a big, expensive project. First you would have to characterise the camera's spectral sensitivities - that takes a monochromator to do well, or a lot of fancy software and a ColorCheckerDC/SG to do approximately. Then you have to spectrally characterise the display (needing a spectrometer), and compute a camera RGB to CIE XYZ calibration matrix. Then you have to write software to control the camera and process the resulting RAW file. Then you have to interface it to the calibration software.

In all, it is probably cheaper and faster to buy a better instrument, such as the spectrometer you would need to spectrally characterise the display.

[People have tried some hacky ways of using a camera as a colorimeter, and have even used it on displays - Argyll will let you do that - but the results are never very good due to the spectral issues, and it won't work with Argyll's display calibration code, because the calibration code is interactive - unlike profiling, it creates test values on the fly.]

Graeme Gill.
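[To illustrate the "camera RGB to CIE XYZ calibration matrix" step: given camera RGB responses and reference XYZ measurements for a set of test patches, a 3x3 matrix can be fitted by least squares. The sketch below uses synthetic, assumed data (the patch count, the "ground truth" matrix, and the noise level are all made up) just to show the shape of the computation - a real workflow would use measured patches from a target like the ColorCheckerDC/SG:]

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data: 24 patches (roughly ColorChecker-sized),
# camera RGB responses in [0.05, 1.0].
n_patches = 24
rgb = rng.uniform(0.05, 1.0, size=(n_patches, 3))

# Hypothetical ground-truth RGB -> XYZ matrix, used only to fabricate
# "measured" XYZ values with a little measurement noise added.
true_m = np.array([[0.6, 0.3, 0.1],
                   [0.2, 0.7, 0.1],
                   [0.0, 0.1, 0.9]])
xyz = rgb @ true_m.T + rng.normal(0.0, 1e-3, size=(n_patches, 3))

# Least-squares fit: solve rgb @ sol ~= xyz for sol, then transpose so
# the rows of m_fit map camera RGB to X, Y and Z respectively.
sol, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)
m_fit = sol.T

print(m_fit)
```

With low-noise data the fitted matrix recovers the generating matrix closely; the real difficulty, as noted above, is obtaining trustworthy RGB and XYZ pairs in the first place, given the camera's non-colorimetric spectral sensitivities.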