The output color depth option depends on the GPU and software
environment. I certainly don't have that option, but that could well be
because I'm using an analog monitor. I've also corresponded with an
Nvidia rep through their customer support channel, and they seem to be
clueless about this new feature.
Interesting, thanks for the link. I also didn't realize you can just set your output color depth in the Nvidia control panel! Are dispcal and dispread able to take advantage of that when they display test patches, or are they still limited to 8-bit for that?
On Thu, Aug 11, 2016 at 3:29 PM, Marwan Daar <marwan.daar@xxxxxxxxx> wrote:
My experimentation suggests that the Windows desktop can use at
least 10 of the 16 bits in the LUT values (Nvidia GPU and CRT).
Whether this is achieved via dithering or not is a separate
question.
Also, while 10-bit precision can be achieved, it is still only in
the context of displaying 256 distinct values per channel
simultaneously, since the video LUT has only 256 entries per
channel.
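To illustrate the kind of probe this involves (my own sketch, not the actual procedure used above): the Windows video LUT holds 256 entries per channel, each a 16-bit value - the format that GDI's SetDeviceGammaRamp accepts - so you can build two ramps that differ only below the 8-bit level and watch whether the screen changes. All function names here are hypothetical:

```python
# Hypothetical probe for sub-8-bit LUT precision on the Windows desktop.
# The video LUT has 256 entries per channel, each a 16-bit value (the
# layout SetDeviceGammaRamp accepts). If the display path resolves only
# the top 8 bits, an offset of one 10-bit step should be invisible.

def identity_ramp():
    """16-bit identity ramp: entry i maps to i * 257 (0..65535)."""
    return [i * 257 for i in range(256)]

def offset_ramp(step_bits=10):
    """Identity ramp shifted up by one step at `step_bits` precision
    (64 counts out of 65536 for 10 bits), clamped to the 16-bit max."""
    delta = 65536 >> step_bits
    return [min(v + delta, 65535) for v in identity_ramp()]

# Alternate between the two ramps while watching a mid-gray patch:
# no visible change suggests an 8-bit-limited path; a subtle uniform
# brightness shift suggests ~10 usable bits (native or dithered).
ramp_a = identity_ramp()
ramp_b = offset_ramp(10)
```

On an actual Windows machine the ramps would be applied via SetDeviceGammaRamp (e.g. through ctypes) or through dispwin; the sketch only builds the ramp data.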
Also see this:
IIRC Windows does support 10-bit color, but only in the
fullscreen exclusive mode of various hardware acceleration
APIs like Direct3D and OpenGL. I heard rumors they were
working on enabling 10-bit color for the desktop, but I have
no idea whether that's true or what the timeline for such a
change would be if it is.
One thing I have wondered is whether anything actually uses
more than 8 bits of the 16-bit RAMDAC values that Windows
uses. I was thinking of modifying dispcal locally to set the
LUT directly instead of modifying the color of the test
patches, to see whether that gives it access to more than 8
bits on my 10-bit monitor - but that would be a pretty major
change (though potentially one that makes the whole process
more direct), and I haven't even checked yet whether dispcal
can easily access the LUT-setting functions of dispwin.
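A rough sketch of what that modification might amount to (hypothetical - this is not dispcal code, and I haven't checked how dispcal reaches dispwin's LUT functions): keep the test patch at one fixed 8-bit framebuffer value and write the exact 16-bit target level into that single LUT entry, so the precision limit becomes the LUT depth rather than the 8-bit framebuffer:

```python
# Hypothetical illustration of the "set the LUT instead of the patch
# color" idea: the patch is always drawn at a fixed 8-bit index, and the
# desired 16-bit level is written into that one LUT slot, so any extra
# LUT precision (beyond 8 bits) becomes reachable.

PATCH_INDEX = 128  # hypothetical fixed framebuffer value for the patch

def ramp_for_target(target_16bit, patch_index=PATCH_INDEX):
    """Identity ramp with one entry replaced by the exact target level."""
    ramp = [i * 257 for i in range(256)]   # 16-bit identity ramp
    ramp[patch_index] = target_16bit       # patch pixels now map to target
    return ramp

# e.g. a level halfway between the 8-bit steps at indices 128 and 129,
# which an 8-bit framebuffer alone could never request:
ramp = ramp_for_target(128 * 257 + 128)
```

The resulting ramp would then be loaded with whatever LUT-setting path dispwin exposes (or SetDeviceGammaRamp on Windows); whether the extra bits survive to the screen is exactly the open question.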