Hello all,

I'm writing some code for display LUT programming and I have a question, if I may. The video card's "gamma ramp" memory holds 256 discrete entries (8-bit addressing) for each of the three RGB channels, yet the individual values stored in those 256 entries are 16-bit. So, effectively, is the calibration done in 16 bits or in 8 bits? Are monitors actually able to render 65,536 discrete levels? For example, if I store the value 65280 in, say, LUT(256), or the value 65220 or 65315 instead, is the video board actually capable of generating a signal that matches these subtly different drive levels?

Please forgive my hardware ignorance.

Best / Roger
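To make the question concrete, here is a minimal sketch of the kind of table I'm building: a 256-entry ramp per channel where each entry holds a full 16-bit value (the layout used by, e.g., Windows' `SetDeviceGammaRamp`). The `build_gamma_ramp` helper and the 2.2 exponent are just illustrative assumptions, not any particular driver's code.

```python
def build_gamma_ramp(gamma=2.2, entries=256, max_out=65535):
    """Build one channel of a gamma ramp: `entries` slots (8-bit index),
    each storing a 16-bit value in 0..max_out."""
    ramp = []
    for i in range(entries):
        normalized = i / (entries - 1)           # map index to 0.0 .. 1.0
        corrected = normalized ** (1.0 / gamma)  # inverse-gamma encode
        ramp.append(round(corrected * max_out))  # quantize to 16 bits
    return ramp

ramp = build_gamma_ramp()
# Only 256 of the 65,536 possible 16-bit codes are actually selected per
# channel, but each selected code can sit anywhere on the 16-bit scale --
# which is exactly what prompts the question of whether the hardware can
# resolve values as close together as 65220, 65280, and 65315.
```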