Sorry, I meant 256 values per channel.
There is also evidence that GPUs do some form of dithering to achieve higher than 8-bit precision. I believe AMD even has an option for this.
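For illustration, here is one simple way dithering can buy extra effective precision: spread a 10-bit value over several 8-bit pixels so that their average approximates the original. This is only a toy sketch, not any GPU vendor's actual algorithm, and the helper name is invented.

```python
def dither_10bit_to_8bit(value10, block=4):
    """Spread a 10-bit channel value (0-1023) over `block` 8-bit pixels.

    Toy spatial dithering: the low 2 bits that an 8-bit output cannot
    hold are recovered by bumping a fraction of the pixels up by one,
    so the block average carries the extra precision.
    """
    assert 0 <= value10 <= 1023
    base, frac = divmod(value10, 4)  # 10-bit value = 8-bit base * 4 + remainder
    # `frac` of the pixels get base+1, the rest get base (clamped to 255)
    return [min(base + (1 if i < frac else 0), 255) for i in range(block)]

pixels = dither_10bit_to_8bit(513)          # 513 = 128 * 4 + 1
print(pixels)                               # [129, 128, 128, 128]
print(sum(pixels) / len(pixels) * 4)        # block average recovers 513.0
```

Averaged over the 4-pixel block, the quantization error vanishes, which is why a dithered 8-bit signal can look like it has roughly 10 bits of precision. Real drivers use spatial and/or temporal patterns rather than this naive fill.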
On Thu, 11 Aug 2016 09:29:53 -0400
Marwan Daar <marwan.daar@xxxxxxxxx> wrote:
> My experimentation suggests that the Windows
> desktop can use up to at least 10 bits of the
> 16-bit values (nvidia GPU and CRT). Whether
> this is achieved via dithering or not is a
> separate question. Also, while 10-bit
> precision can be achieved, it is still only in
> the context of displaying 256 colors.

There must be some mixup here. 256-color limits are many years back in history. All modern monitors claim to be able to display 16 million colors. Neither the graphics card nor the operating system is involved in this; from their point of view, there are actually 8 or 10 bits per channel.