See this link
https://kb.portrait.com/help/visual-color-comparison
But again, unless you are comparing different types of displays it’s not really
necessary
On Jun 14, 2021, at 7:55 PM, Peter Folk <peter.folk01@xxxxxxxxx> wrote:
Neil,
what do you mean by "optical comparator" ?
Peter
On Mon, Jun 14, 2021 at 4:32 PM Neil Woodall <vidsurfr@xxxxxxxxx> wrote:
For a display at a reference brightness level and viewing environment (think
dark, but not black, room), it probably doesn’t matter anymore. If you are
using LCD, then black has enough leakage for a spectrometer to measure and
if it’s OLED, then it’s black, so why even bother. By the time you get to a
level of 16/255, the output is high enough (~0.2 nits) for an accurate
spectrometer measurement and since almost all 3D LUT color corrections are
17x17x17, that’s going to be good enough. The only difference is the amount
of time. So if you are measuring 50+ points, you probably want to profile
and then use the colorimeter; it will be faster. What does matter is
that if you have a wide gamut display and have both a colorimeter and
spectrometer you need to create a profile for the colorimeter based on the
spectrometer readings. This is because the colorimeter filters are never an
exact match for the CIE 1931 standard XYZ response, and the narrower the
spectrum of the display, the bigger the impact any mismatches will have.
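That profiling step can be sketched as a least-squares fit, in the spirit of a CCMX: a 3×3 matrix that maps the colorimeter’s XYZ readings onto the spectrometer’s for a handful of patches measured on the same display. All readings below are invented for illustration.

```python
import numpy as np

# Hypothetical paired readings of R, G, B, W patches on the same display.
# Rows are patches; columns are X, Y, Z. Values are illustrative only.
xyz_spectro = np.array([
    [41.2,  21.3,   1.9],   # red patch, reference spectrometer
    [35.8,  71.5,  11.9],   # green patch
    [18.0,   7.2,  95.0],   # blue patch
    [95.0, 100.0, 108.9],   # white patch
])
xyz_colorimeter = np.array([
    [40.1,  21.9,   2.3],   # same patches, filter colorimeter
    [36.9,  70.2,  12.8],
    [17.2,   7.9,  93.1],
    [94.2, 100.0, 108.2],
])

# Least-squares fit of M such that xyz_colorimeter @ M.T ~ xyz_spectro,
# i.e. M maps raw colorimeter readings onto the spectrometer's scale.
M, *_ = np.linalg.lstsq(xyz_colorimeter, xyz_spectro, rcond=None)
M = M.T

def correct(xyz_meas):
    """Apply the profile to a raw colorimeter reading."""
    return M @ np.asarray(xyz_meas)
```

With the profile in place, the fast colorimeter does the bulk measurements and the matrix pulls them onto the spectrometer’s scale.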
My company does a lot of calibration work for displays. In the lab we use a
spectrometer, but when we set up production for a customer it is usually a
couple of colorimeters and they profile them periodically to make sure all
of them are responding the same.
As long as the wavelength step in the spectrometer is smaller than the
full-width-half-maximum (FWHM) resolution of the spectrometer’s optics,
almost anything will be good enough. The exception is a laser-based
display; there you probably have to step up to a much higher-resolution
device than the i1Pro class.
The advantage of saving the spectrum is that you can go back and change what
color reference you want to use. This is mainly for people doing research.
All video standards and calibrations are based on the CIE 1931 XYZ response
curves.
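Re-targeting a saved spectrum is just re-integrating it against whichever observer you choose. A toy sketch: both the spectrum and the color-matching functions below are made-up Gaussian stand-ins; real work would load the tabulated CIE 1931 2° observer (or whatever alternative observer you want to research).

```python
import numpy as np

# Wavelength grid (nm) and a hypothetical measured spectral radiance.
wl = np.arange(380, 781, 5)
spectrum = np.exp(-0.5 * ((wl - 550) / 40.0) ** 2)  # toy greenish emitter

# Placeholder color-matching functions (illustrative shapes only; swap in
# the tabulated CIE 1931 values, or a different observer, as needed).
xbar = np.exp(-0.5 * ((wl - 595) / 35.0) ** 2)
ybar = np.exp(-0.5 * ((wl - 557) / 40.0) ** 2)
zbar = np.exp(-0.5 * ((wl - 445) / 25.0) ** 2)

# Tristimulus values: integrate spectrum x CMF over wavelength.
dlam = 5.0  # wavelength step, nm
X = np.sum(spectrum * xbar) * dlam
Y = np.sum(spectrum * ybar) * dlam
Z = np.sum(spectrum * zbar) * dlam

# Chromaticity coordinates from the tristimulus values.
x, y = X / (X + Y + Z), Y / (X + Y + Z)
```

Because the stored quantity is the spectrum itself, only the three `*bar` arrays change when you switch observers; the measurement never has to be redone.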
Finally, wide gamut displays have a narrow enough spectrum that different
individuals see gray differently if they are comparing it to a reference.
For example, if you had a calibrated CRT and a calibrated OLED, most
individuals would see a different color of gray on the two devices. This is
because your individual cone response is different from the CIE 1931 XYZ
curves, likely different from that of the person sitting next to you, and
different from what it was when you were a kid (the transmission spectrum of
the ocular medium, cones, cornea, and lens changes with age…UV damage). You
don’t notice because your eye adapts to the white point; you need to
eliminate the impact of the illumination color to see reflective objects
correctly. Therefore, as long as it’s close and you are not comparing the
display simultaneously to another display or object, the fact that the
colorimeter doesn’t have the exact CIE 1931 XYZ response probably isn’t
that important. BTW, the importance of the white point accuracy was
confirmed to me by a top Hollywood colorist.
For more than 99% of people, the important thing is that all the colors,
from gray to highly saturated, are correct relative to each other and
according to the color standard (sRGB, P3, etc…) of the input, which you can
do with a colorimeter. Think of it this way: the error between the CIE 1931
XYZ response and the XYZ filters in the colorimeter is probably smaller than
the error between the CIE 1931 XYZ response and the XYZ response of your eye. If
it does make a difference, then you need to get an optical comparator so
that you can match the gray of the narrow spectrum display (that everyone
sees a little differently) to gray based on a wide spectrum light source
(which everyone sees the same). Then you measure the white point of the
matched gray and create a custom color space using that measured Wxy instead
of the D65 Wxy coordinate.
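That last step is mechanical: the RGB→XYZ matrix of a color space is fully determined by the primary chromaticities and the white point, so you substitute the measured Wxy for D65. A sketch with illustrative numbers (the white point below is made up, not a real comparator reading):

```python
import numpy as np

def rgb_to_xyz_matrix(prim_xy, white_xy):
    """Build an RGB->XYZ matrix from primary chromaticities and a white point.

    prim_xy: [(xr, yr), (xg, yg), (xb, yb)]; white_xy: (xw, yw).
    Standard construction: scale the primaries so RGB = (1, 1, 1)
    lands exactly on the white point with Y = 1.
    """
    # Columns of P are the primaries' XYZ at unit luminance.
    P = np.array([[x / y, 1.0, (1 - x - y) / y] for x, y in prim_xy]).T
    xw, yw = white_xy
    white_xyz = np.array([xw / yw, 1.0, (1 - xw - yw) / yw])
    S = np.linalg.solve(P, white_xyz)   # per-primary scale factors
    return P * S                        # scale each column by its factor

# sRGB primaries with a hypothetical measured white point instead of D65.
M = rgb_to_xyz_matrix([(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
                      (0.3150, 0.3300))
```

Feeding the comparator-derived Wxy into `white_xy` gives a color space whose gray tracks what you actually see, not what the standard observer predicts.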
On Jun 14, 2021, at 1:19 PM, Ben Goren <ben@xxxxxxxxxxxxxxxx> wrote:
On Jun 14, 2021, at 12:31 PM, <graxx@xxxxxxxxxxxx> <graxx@xxxxxxxxxxxx>
wrote:
Sorry to bring back the age-old question, what kind of “stimulus” colors
exactly benefit from measuring with a (filter-based) colorimeter such as
i1Display (old DTP94, Spyders…) rather than with a (holographic-based)
spectrometer instrument? Extremely saturated colors, such as the “pure
primaries” or the “low Luminance” colors such as below 5 Cd/m2?
Aaron gave you a good response, but didn’t directly address your question
about saturation.
The math-y reply I just sent points at the answer, but also doesn’t
directly answer it.
So long as your display only has three primaries, it doesn’t matter how
saturated they are. Mathematically, it’s a three-dimensional space. And the
colorimeter is also a three-dimensional space.
The one space might be tall, skinny, and slanted compared with the other.
No problem. The CCMX measures the shapes of both and comes up with a
formula to squish / stretch / whatever from the one to the other.
So long as those transformations aren’t excessive — and they aren’t with
real-world displays and colorimeters — the math all “just works.”
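The three-dimensional argument can be demonstrated numerically: because a three-primary display only ever emits linear mixes of its primaries, three sensor readings pin the color down exactly, no matter how saturated (narrow-spectrum) the primaries are. All spectra below are synthetic stand-ins for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
wl = np.arange(380, 781, 5)          # wavelength grid, nm

def peak(center, width):
    """Gaussian stand-in for an emission or filter spectrum."""
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Made-up emission spectra of three display primaries (narrow = saturated).
primaries = np.stack([peak(620, 10), peak(530, 10), peak(460, 10)])

# Made-up colorimeter filter responses (broad, imperfect match to any CMF).
filters = np.stack([peak(600, 60), peak(550, 60), peak(450, 40)])

# Every display output is a linear mix of the primaries, so the sensor
# readings are a fixed 3x3 linear map of the RGB drive values:
A = filters @ primaries.T            # "display as seen by this sensor"

rgb = rng.random(3)                  # arbitrary drive values
readings = A @ rgb                   # what the colorimeter reports

# A is 3x3 and invertible for real displays, so the drive values (and
# hence the emitted color) are fully recoverable from three readings:
recovered = np.linalg.solve(A, readings)
```

This is the "tall, skinny, slanted" picture in matrix form: the CCMX is just another 3×3 map composed on top, so the whole chain stays linear and invertible.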
And, again: the colorimeters are much better than the spectrometers in low
luminance settings. Since that’s where the most real-world problems occur,
that’s how colorimeters in a well-formed workflow can match and often
significantly improve upon spectrometers.
Cheers,
b&