[argyllcms] Re: Argyll and 30-bit colors
- From: ternaryd <ternaryd@xxxxxxxxx>
- To: argyllcms@xxxxxxxxxxxxx
- Date: Wed, 17 Aug 2016 08:02:59 +0200
On Wed, 17 Aug 2016 15:08:31 +1000
Graeme Gill <graeme@xxxxxxxxxxxxx> wrote:
> Oh - OK - that's the routine that tries to
> measure the actual video depth.
If that referred to the monitor, I'd know the
answer (it's 16 bits, but I don't know the size. 256
> Without taking huge amounts of time, it
> will fail if the measurements are not
> consistent enough to draw a conclusion.
> This typically comes down to the resolution
> and stability of the instrument, as well as
> the stability of the display.
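If I understand the routine right, it steps through adjacent codes at increasing depths and checks whether the instrument still resolves them. A rough sketch of that idea in Python (my own guess at the logic, not Argyll's actual code; `simulate_measurement` stands in for a real instrument reading):

```python
import random

def simulate_measurement(level, depth, noise=0.00005):
    """Stand-in for an instrument reading: quantize a [0,1] video level
    to `depth` bits and add a little measurement noise."""
    q = round(level * (2**depth - 1)) / (2**depth - 1)
    return q + random.gauss(0, noise)

def effective_depth(measure, start=8, max_depth=12):
    """Probe increasing depths: at depth d, step through consecutive
    d-bit codes. If the true depth is lower, about half of the adjacent
    steps collapse to the same measured level."""
    for d in range(start, max_depth + 1):
        step = 1 / (2**d - 1)
        readings = [measure(0.25 + i * step) for i in range(17)]
        diffs = [b - a for a, b in zip(readings, readings[1:])]
        # a step that measures as less than half its nominal size
        # is treated as unresolved
        collapsed = sum(1 for x in diffs if x < step / 2)
        if collapsed > len(diffs) // 4:
            return d - 1  # steps stopped being resolvable at depth d
    return max_depth
```

With a noisier instrument the `collapsed` test trips earlier or erratically, which would match the "consistent enough to draw a conclusion" behaviour described above.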
The Eizo monitors have a feature called DUE
(Digital Uniformity Equalizer) which claims to
be especially stable, and I tend to confirm
this visually. There is a website where the
author ran lab-grade tests and confirmed the
stability. So it seems the ColorMunki Photo
from X-Rite is rather unreliable. Right?
> > We don't have access to the VideoLUT.
>
> Hmm. That's bad. You won't get very far with
> calibration in that situation. Are you
> using the V1.9.0 (or newer) Beta dispwin?
No. I'm using the official Debian packages,
which in this variant (sid) are at 1.8.3.
Does this mean that the video LUT is in the
video card rather than in the monitor?
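Either way, where the correction is applied seems to matter: pushing a curve through an 8-bit output stage throws away distinct levels, while a higher-bit LUT keeps them. A toy illustration (hypothetical numbers, just counting distinct output codes after a small gamma tweak):

```python
def correct_via_lut(lut_entries, out_depth, gamma=1.1):
    """Send every LUT input code through a gamma-correction curve
    quantized to `out_depth` bits, and count how many distinct
    output codes survive the double quantization."""
    out_max = 2**out_depth - 1
    outputs = {round(((i / (lut_entries - 1)) ** gamma) * out_max)
               for i in range(lut_entries)}
    return len(outputs)

print(correct_via_lut(256, 8))   # 8-bit output: dark levels collide
print(correct_via_lut(256, 10))  # 10-bit output: all 256 inputs survive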
> USB connection to the display is likely to
> be more reliable, but again none of that is
> documented, and nothing is standard.
On both Eizo's English and German websites I
found suggestions that there is Linux support,
but when it comes to downloads, there aren't
any, and Eizo hasn't answered my questions.
In the meantime the prices for the monitors
have dropped by 25% (a difference of €500),
and I really want to get the most out of what
I've paid for. So I decided to buy my first
copy of Windows in more than 30 years of
computing. Eizo should get paid by Microsoft
for forcing me to do that, because I will use
it only to calibrate the monitor, and then
leave it alone.
ColorNavigator also uses a USB connection. If
you know of a Windows USB sniffer and can do
something with such a dump, I would be willing
to install it and perform any operation you
suggest. Maybe reverse engineering this would
spare others from spending a few hundred bucks
just to calibrate a monitor, and maybe it
could be extrapolated to some other cases.
BTW, what is a good way to determine whether I
am actually seeing 10 bpc on screen? Is there
a recommendable test image? Is there a way to
measure that with my spectrometer? The nVidia
driver (and X) claim it's 10 bpc, but I
suspect I'm still seeing only 8.
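For lack of a known-good test image, one can be generated. The sketch below (my own idea, not an Argyll tool) writes a 16-bit grayscale ramp as a PGM file in which adjacent columns are exactly one 10-bit code apart. On a true 10-bpc pipeline the ramp should look smooth; an 8-bpc pipeline collapses every four columns into one, giving visible bands.

```python
import struct

def write_ramp(path="ramp10bit.pgm", width=1024, height=256):
    """Write a 16-bit grayscale PGM: column x has level x/(width-1),
    so with width=1024 adjacent columns differ by one 10-bit code."""
    maxval = 65535  # 16-bit samples, stored big-endian per the PGM spec
    row = b"".join(struct.pack(">H", round(x / (width - 1) * maxval))
                   for x in range(width))
    with open(path, "wb") as f:
        f.write(b"P5\n%d %d\n%d\n" % (width, height, maxval))
        f.write(row * height)

write_ramp()
```

Viewed 1:1 (no scaling, and no colour management quietly converting it to 8 bits), count the bands in the dark end; a smooth ramp suggests the full 10 bits are getting through. Whether a given image viewer actually passes 16-bit samples to the display is, of course, its own question.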