[argyllcms] Re: Argyll and 30-bit colors

  • From: Marwan Daar <marwan.daar@xxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Thu, 11 Aug 2016 09:29:53 -0400

My experimentation suggests that the Windows desktop can use at least 10 bits of the 16-bit values (NVIDIA GPU and a CRT). Whether this is achieved via dithering or not is a separate question. Also, while 10-bit precision can be achieved, it is still only in the context of displaying 256 colors simultaneously.
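
For reference, the Windows video LUT ("gamma ramp") is 256 entries of 16-bit values per channel, which is why the extra precision only ever applies to 256 levels per channel at a time. A minimal sketch of the kind of probe I mean, using plain Win32 calls (not Argyll code): write a linear ramp, then nudge one entry by less than one 8-bit step and see whether the output actually changes, visually or with an instrument.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    HDC hdc = GetDC(NULL);          /* DC for the primary display */
    WORD ramp[3][256];              /* R, G, B: 256 entries, 16 bits each */

    if (!GetDeviceGammaRamp(hdc, ramp)) {
        fprintf(stderr, "GetDeviceGammaRamp failed\n");
        return 1;
    }

    /* Linear ramp, then nudge level 128 by 1/1024 of full scale
       (64 out of 65535). If the display chain really carries 10 bits
       (or dithers), a patch drawn at 8-bit level 128 should change;
       on a plain 8-bit chain it should not. */
    for (int i = 0; i < 256; i++) {
        WORD v = (WORD)((i * 65535) / 255);
        ramp[0][i] = ramp[1][i] = ramp[2][i] = v;
    }
    ramp[0][128] += 64;
    ramp[1][128] += 64;
    ramp[2][128] += 64;

    if (!SetDeviceGammaRamp(hdc, ramp)) {
        fprintf(stderr, "SetDeviceGammaRamp failed\n");
        return 1;
    }

    ReleaseDC(NULL, hdc);
    return 0;
}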

Also see this: https://hardforum.com/threads/nvidia-or-amd-for-color-accuracy-and-calibration-features.1873362/page-4#post-1042471971

IIRC Windows does support 10-bit color, but only in the fullscreen exclusive mode of various hardware acceleration APIs like Direct3D and OpenGL. I heard rumors they were working on enabling 10-bit color for the desktop, but I have no idea whether that's true or what the timeline for such a change would be if it is.
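
For what it's worth, the usual way to reach that 10-bit path is to ask the API for a deeper back buffer explicitly. A rough sketch of that (Direct3D 11, fullscreen exclusive; the window handle and mode size are placeholders, and this is not anything Argyll currently does):

#include <windows.h>
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

/* Try to create a fullscreen-exclusive D3D11 swap chain with a
   10-bit-per-channel back buffer. 'hwnd' is assumed to be an
   existing top-level window. */
HRESULT create_10bit_swapchain(HWND hwnd,
                               IDXGISwapChain **swap,
                               ID3D11Device **dev,
                               ID3D11DeviceContext **ctx)
{
    DXGI_SWAP_CHAIN_DESC sd = {0};

    sd.BufferCount       = 2;
    sd.BufferDesc.Width  = 1920;       /* placeholder mode */
    sd.BufferDesc.Height = 1080;
    sd.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM; /* 10 bits/channel */
    sd.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    sd.OutputWindow      = hwnd;
    sd.SampleDesc.Count  = 1;
    sd.Windowed          = FALSE;      /* fullscreen exclusive */
    sd.SwapEffect        = DXGI_SWAP_EFFECT_DISCARD;

    return D3D11CreateDeviceAndSwapChain(
        NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
        NULL, 0, D3D11_SDK_VERSION,
        &sd, swap, dev, NULL, ctx);
}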

One thing I have wondered is whether anything actually uses more than 8 bits of the 16-bit RAMDAC values that Windows uses. I was thinking of modifying dispcal locally to set the LUT directly instead of modifying the color of the test patches, to see if that gives it access to more than 8 bits on my 10-bit monitor. But that would be a pretty major change (potentially making the whole process more direct), and I haven't even checked yet whether dispcal can easily access the LUT setting functions of dispwin.
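
To make the idea concrete: the patch would stay at a fixed 8-bit framebuffer level, and only the LUT entry behind it would move, in 16-bit steps. A rough sketch of that loop, again with plain Win32 calls rather than dispwin's actual internal interfaces, and with measure_patch() as a hypothetical stand-in for driving the instrument:

#include <windows.h>

/* Hypothetical: read the instrument with the current screen contents. */
extern void measure_patch(void);

/* Sweep the LUT entry behind a patch drawn at framebuffer level 'level'
   through finer-than-8-bit output values, without touching the patch
   colour itself. One 8-bit step spans 257 16-bit codes (65535/255),
   so keep steps <= 257 to stay within this entry's range and keep the
   ramp monotonic. */
void sweep_lut_entry(int level, int steps)
{
    HDC hdc = GetDC(NULL);
    WORD ramp[3][256];

    GetDeviceGammaRamp(hdc, ramp);

    for (int s = 0; s < steps; s++) {
        /* Base 16-bit value for this 8-bit level, plus a sub-8-bit offset. */
        int v = (level * 65535) / 255 + s;
        if (v > 65535)
            v = 65535;

        ramp[0][level] = ramp[1][level] = ramp[2][level] = (WORD)v;
        SetDeviceGammaRamp(hdc, ramp);

        measure_patch();    /* dispcal's normal measurement would go here */
    }

    ReleaseDC(NULL, hdc);
}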

