[argyllcms] Re: Measuring and converting colors to sRGB

  • From: Gerhard Fuernkranz <nospam456@xxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Sun, 12 Feb 2012 22:26:24 +0100

On 12.02.2012 14:50, Pascal de Bruijn wrote:
Hi,

A friend of mine had an old DEC VT420 (connected to a VAX) lying
about... It's an amber/orange terminal, so I thought it would be fun to
measure which particular amber hue it emitted.

So I took a reading (using a ColorMunki):

$ spotread -v -s -d -y l

I've attached the resulting .sp file, specplot tells me this about it:

$ specplot dec_vt420_amber.sp
Type = fa = 1, File 'dec_vt420_amber.sp'
XYZ = 1.231303 1.000000 0.023474, x,y = 0.546086 0.443503
D50 L*a*b* = 100.000000 42.461990 138.939711
CMYV density = 0.061589 0.730825 1.795940 0.299188
CCT = 2025.343496, VCT = 1332.653724
CRI = 29.1 (Invalid)
CIEDE2000 Delta E = 33.572109

But when I put the XYZ values through xicclu, I get a huge amount of
clipping, which results in a hue shift towards yellow:

$ xicclu -f b -s 255 srgb.icc
1.231303 1.000000 0.023474
1.231303 1.000000 0.023474 [XYZ] ->  MatrixBwd ->  255.000000 219.479747 0.000000 [RGB] (clip)
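
For reference, the clipping can be reproduced outside of xicclu with a plain
XYZ-to-linear-sRGB matrix multiply. A minimal Python sketch (assuming the
standard IEC 61966-2-1 D65 matrix, so the numbers differ slightly from the
D50-adapted PCS that srgb.icc uses):

# Measured XYZ, normalized to Y = 1.0 by spotread/specplot
X, Y, Z = 1.231303, 1.000000, 0.023474

# XYZ (D65) -> linear sRGB
r =  3.2406 * X - 1.5372 * Y - 0.4986 * Z
g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
b =  0.0557 * X - 0.2040 * Y + 1.0570 * Z

print("linear RGB: %.3f %.3f %.3f" % (r, g, b))
# -> roughly 2.441 0.684 -0.111, far outside [0, 1], hence the heavy
#    clipping (and the hue shift once the values are forced into range)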

So the big question is, did I use the wrong mode for measuring the
terminal? Or am I just misusing xicclu?

Usually, the only color you can reproduce on a color display at 100% luminance
is white. If the given chromaticity is reproducible at all on this display,
then certainly only at a lower luminance level. Furthermore, I suspect that
what you actually want is to reproduce the color with absolute colorimetric
intent?
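
As a rough numeric illustration of this (a small Python sketch, again assuming
the plain D65 sRGB matrix as a stand-in for an absolute colorimetric transform
through srgb.icc), keep the measured chromaticity but scale the luminance down
before checking against the [0, 1] sRGB cube:

def xyz_to_linear_srgb(X, Y, Z):
    # standard D65 XYZ -> linear sRGB matrix (IEC 61966-2-1)
    r =  3.2406 * X - 1.5372 * Y - 0.4986 * Z
    g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
    b =  0.0557 * X - 0.2040 * Y + 1.0570 * Z
    return r, g, b

X0, Y0, Z0 = 1.231303, 1.000000, 0.023474   # measured amber, Y normalized to 1

for scale in (1.0, 0.5, 0.25, 0.1):
    rgb = xyz_to_linear_srgb(scale * X0, scale * Y0, scale * Z0)
    status = "in gamut" if all(0.0 <= c <= 1.0 for c in rgb) else "clipped"
    print("Y = %.2f: %s  (%s)" % (scale, " ".join("%.3f" % c for c in rgb), status))

# Scaling the luminance down brings the red channel (> 1 at Y = 1.0) back
# into range, but with this matrix the blue channel stays slightly negative
# at every scale, which suggests the measured chromaticity itself falls a
# little outside the sRGB primaries.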

Best Regards
Gerhard

