Graeme,
I confess there's something 'voodoo' about using the gamma ramp with
entry sizes larger than 8 bits. Everywhere I look, I only see references
to arrays of 256 entries.
In my research so far, I have managed to establish, with the help of the
GetDeviceCaps() API (gdi32.dll), that there are more than 24 bits per pixel
in my display buffer. The function gives me a value of 32 bits per pixel:
BITSPERPIXEL = 12
BPP = GetDeviceCaps(hSDC, BITSPERPIXEL)
Now, this is obviously > 24.
There was one video I watched where the person mentioned '30' bits per pixel
plus 2 extra bits for 'transparency'. I still have to lay my hands on a
Microsoft blurb that documents this 'fact', but if that is the case, then I
think I'm on the right track.
/ Roger
-----Original Message-----
From: argyllcms-bounce@xxxxxxxxxxxxx <argyllcms-bounce@xxxxxxxxxxxxx> On
Behalf Of Graeme Gill
Sent: December 28, 2019 6:09 PM
To: argyllcms@xxxxxxxxxxxxx
Subject: [argyllcms] Re: Programming the video card
graxx@xxxxxxxxxxxx wrote:
I upgraded yesterday my NVIDIA 1070 video card to benefit from 10-bit
performance. But I need to determine, at the hardware level, what
changes have taken place?
The SetGammaRamp() call still returns 256 elements, regardless of bit
depth.