[argyllcms] Re: Argyll and 30-bit colors
- From: Graeme Gill <graeme@xxxxxxxxxxxxx>
- To: argyllcms@xxxxxxxxxxxxx
- Date: Tue, 16 Aug 2016 13:13:30 +1000
Troy Sobotka wrote:
> On Thu, Aug 11, 2016, 12:49 AM Graeme Gill
Here is a PDF from nVidia which contains some
code snips for Windows and Linux:
From the contents of these, it seems that neither MS nor
Apple currently has native operating system support for 30 bit
frame buffers; both rely on OpenGL rendering
to access them. This is a considerable obstacle
for non-OpenGL based applications, so it
makes it less likely that many applications will
make use of it.
In contrast, X11/Linux should work out of the box using
existing APIs, if the applications look for it.
Overall I'm a bit unconvinced by it all.
Calibration is already > 8 bpc, and profiling
doesn't sample in enough detail for > 8 bpc to
be of much significance (i.e. what does it
matter that you are restricted to 16 million colors out
of 1 billion, if you are only measuring a few
thousand of them?). The resulting calibration
curves are floating point, and the profiles are 16 bpc, so
if your system has > 8 bpc, the end result will be smooth.
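The arithmetic behind the "16 million out of 1 billion" point can be sketched as follows (the patch count is a made-up, typical-ish figure, not taken from Argyll):

```python
# Rough arithmetic: addressable colors at 8 bpc vs 10 bpc, compared with
# the number of patches a profiling run actually measures.
# patches_measured is a hypothetical figure, not dispcal's default.
colors_8bpc = 2 ** (8 * 3)     # 16,777,216 addressable colors
colors_10bpc = 2 ** (10 * 3)   # 1,073,741,824 addressable colors
patches_measured = 3000        # assumed, for illustration only

print(colors_8bpc)             # 16777216
print(colors_10bpc)            # 1073741824
# Either way, the measured patches sample only a tiny fraction:
print(patches_measured / colors_8bpc)
```

Either gamut is sampled so sparsely that the profile is dominated by interpolation between measurements, not by the input quantization.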
In terms of dispcal, there are two directions
I'd like to go in, neither of which relies on
30 bit frame buffers. One is to use dithering (which
is tricky to do automatically - it's hard to quickly
and reliably figure out the actual bit depth from VideoLUT
to display using an instrument), and the other direction
is to change dispcal's algorithm to be less sensitive
to quantization.
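The idea behind temporal dithering can be sketched like this (illustrative only, not Argyll's implementation): a target level lying between two 8-bit codes is approximated by alternating between those codes so that the time-average converges on the target.

```python
# Temporal dithering sketch: approximate a finer-than-8-bit level on an
# 8-bit output by alternating between the two adjacent 8-bit codes.
# Illustrative only - not how dispcal or any driver actually does it.
def dither_frames(target, frames):
    """Yield 8-bit codes whose average tracks `target` (a 0..255 float)."""
    lo = int(target)        # 8-bit code just below the target
    err = 0.0               # accumulated fractional error
    out = []
    for _ in range(frames):
        err += target - lo
        if err >= 1.0:      # enough error built up: emit the upper code
            out.append(lo + 1)
            err -= 1.0
        else:
            out.append(lo)
    return out

codes = dither_frames(128.25, 400)   # a level needing finer precision
avg = sum(codes) / len(codes)
print(avg)                           # averages 128.25 over the run
```

Each individual frame is still a plain 8-bit code, which is why a dithered 8-bit path can stand in for deeper frame buffers when measuring or displaying.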
> While I would expect that OpenGL also works
> with cards of other manufacturers, I realise
> that including OpenGL support opens a Pandora's
> box for developers and users.
I do have some OpenGL code from a few years back that
I could base something on - it was the only way of getting
smooth crossfades between images on an MSWin machine :-
GDI and MS's other APIs were a joke, but years-old OpenGL
had a stable API and just worked.
So I suppose an OpenGL test window option is a possibility.
> Profiling my monitor, I get failed patches
> around the black level, and I hoped that using
> 10-bits could help.
Dithering does help, as can be experimented with using madTPG.
> I am a pure Linux user, but I have seen enough
> evidence that 30-bit colors are possible in
> Windows. Sending 10-bit colors to display the
> patches a colorimeter or spectrometer should
> measure may make a difference.
I'm not so sure. The basic colorimetry isn't that
repeatable per measurement, unless you are using expensive
instruments (Klein K10 etc.), and there is little
disadvantage in choosing profile test points that land
on 8 bit values, since the resulting profile is
16 bpc, so it can be used at whatever precision
the display is capable of.
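For example (a sketch, not Argyll's actual patch generator): a neutral ramp whose test points land exactly on 8-bit codes, so an 8-bit display path adds no quantization error to what is measured.

```python
# Sketch: choose neutral-ramp test points sitting on exact 8-bit codes,
# so the values sent to the display survive an 8-bit path unchanged.
# The ramp length is arbitrary; this is not Argyll's patch selection.
steps = 17
codes = [round(i * 255 / (steps - 1)) for i in range(steps)]
points = [c / 255.0 for c in codes]      # device values in 0..1

# Re-quantizing these points to 8 bits is lossless:
requant = [round(p * 255) for p in points]
print(requant == codes)                  # True
```

Since the chosen points round-trip through 8-bit quantization exactly, nothing is lost by profiling through an 8 bpc path.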
> Similarly, whether an
> image has 100% intensity of a color channel at
> 255 or 1023 seems to be a choice of the
> application (as long as the hardware is able to display it).
Yes - the application needs to be 30 bit aware/capable
to get the benefits of a 30 bit frame buffer and to
exploit the smoothness of 16 bpc ICC profiles.
But the profiler doesn't need access to 10 bpp to
characterize the display. It will interpolate between
8 bpp values, just like it interpolates between
the relatively few samples measured.
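The interpolation point can be sketched as follows (hypothetical measurements and simple linear interpolation, not the profile's actual model):

```python
# Sketch of why the profiler can characterize a display from 8-bit inputs:
# the profile interpolates between sampled codes, just as it interpolates
# between the relatively few measured patches.  The table values are
# made up for illustration.
measured = {0: 0.0005, 128: 0.216, 255: 1.0}   # code -> measured luminance

def interp(code, table):
    """Linearly interpolate a response table at a (possibly fractional) code."""
    keys = sorted(table)
    for a, b in zip(keys, keys[1:]):
        if a <= code <= b:
            t = (code - a) / (b - a)
            return table[a] + t * (table[b] - table[a])
    raise ValueError("code out of range")

# Estimate the response at a level halfway between two 8-bit codes,
# i.e. the kind of value a 10-bit pipeline could actually address:
print(interp(128.5, measured))
```

The estimate at the in-between code comes from the same interpolation machinery that fills in the gaps between measured patches, so 10 bpc access is not needed for characterization.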