[argyllcms] Re: Calibrate to BT.2100 PQ curve?
- From: Niklas Haas <argyll@xxxxxxxxx>
- To: Graeme Gill <graeme@xxxxxxxxxxxxx>
- Date: Wed, 28 Feb 2018 17:11:03 +0100
On Wed, 28 Feb 2018 12:23:23 +1100, Graeme Gill <graeme@xxxxxxxxxxxxx> wrote:
> Yes and no. If they are accessible, ArgyllCMS uses the per channel
> VideoLUT hardware extensively to access the full output bit depth to
> the display by generating test sample specific curves. Typically
> dispcal is used to create display calibration curves that set white
> point and linearize the display response, and then of course the
> calibration curves will be loaded in the HW for subsequent
> profiling. If calibration is not desired, then either the existing
> VideoLUT curves can be left in place, or the VideoLUT is bypassed
> (by loading test sample specific curves) to profile the display
> to the highest available precision.
Interesting, I did not know that.
> Lack of subtlety is a fast way of killing HDR off.
Amen. And on top of the lack of subtlety, there's a lack of consideration
for viewing preferences and the viewers' eyes' ability to adjust to
dynamic brightness changes.
Sure, dynamic eye adaptation might work for a 55" screen sitting a meter
in front of you, but home users don't do that. They normally sit 2-3m
from the display, on a sofa. And at this range, for typical 4K displays,
dynamic eye adjustment doesn't work since the screen doesn't cover a
sufficient area of the retina. So the bright scenes, rather than looking
like realistic outdoor scenes, end up just hurting the viewer's eyes.
I imagine eye fatigue will be the biggest cause of HDR's decline, unless
mastering engineers figure out how to master in HDR, and fast.
> But do they have to be perfectly separated ?
> Flare and glare near a highlight should conceal loss of
> black level near highlights to a large degree.
That's actually a very good point. Even with a 1000:1 contrast, if
you can overdrive the backlight by a factor of 10, you would be able to
display the HDR material correctly by just making the signal darker to
compensate for the increased backlight brightness.
Essentially, changing the backlight scale would either clip bright
highlights (low backlight brightness) or clip dark spots (high backlight
brightness). With this in mind, a FALD implementation would actually be
able to faithfully represent HDR as long as the scene doesn't contain
both bright and dark spots within the same "backlighting" cells.
I must add this to the list of things I'm planning to test as soon as I
can actually drive the display in HDR mode using my computer. I'll also
most likely write a summary of what things need changing and submit them
to Dell in the hopes that a bored engineer may accidentally stumble upon it.
Now if only we had an open-source firmware where we could fix these
things ourselves, add adaptive sync, reduce the input latency, and make
it the best display on the market...
> Right, but isn't that merely the issue that the GUI isn't color
> managed ? If it was, then all normal (non HDR) content would
> remain at 100 cd/m^2 (or whatever the setting is in translating
> sRGB into HDR) ?
I'm referring to the display mode of operation where the FALD is
enabled, but the display is *not* in PQ mode. The FALD can be enabled
separately (oddly enough by setting the device into "Game" or "Movie" mode),
in which case it does use the FALD to increase dynamic contrast, but it
decides to do so by also making the white regions way too bright, much
brighter than 100 cd/m^2. If the brightness controls worked as intended,
then this would not be an issue - and the only net effect of the FALD
would be to make the dark regions darker, thus simulating a high
contrast without murdering my retinas.
But now I'm curious about what happens when the display is in PQ mode.
In theory, with a color managed input, in PQ mode, and using
well-mastered sources (or a well-configured tone mapping algorithm),
nothing except highlights should exceed the 100 cd/m^2 mark - so it
would indeed never end up displaying too bright on the display.
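For reference, the BT.2100 PQ inverse EOTF (constants straight from the spec) shows where that 100 cd/m^2 mark lands in the signal range:

```python
# BT.2100 PQ inverse EOTF: absolute luminance -> signal level.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Absolute luminance (cd/m^2, up to 10000) -> PQ signal (0..1)."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

print(round(pq_encode(100), 3))      # 0.508: about half the signal range
print(round(pq_encode(100) * 1023))  # 520: the corresponding 10-bit code
```

So everything up to diffuse white fits in the lower half of the PQ range, leaving the upper half entirely for highlights.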
> Wow - that sounds like a terrible implementation. The demo I saw
> some years ago of a Brightside display seemed to use instantaneous
> back-light adjustments, and was relatively seamless. I would
> have expected the technology to have improved since then, not
> got worse! (Maybe DELL are simply incompetent in their
> implementation ?)
> Doesn't match what I know of/saw in the Brightside implementation.
I think I need to re-evaluate the display in PQ mode before calling it,
it's just that the only device I have that's currently capable of
generating a PQ signal also seems to be wholly incompatible with the
display, resulting in an awful, unusable image.
And even though it supports hardware calibration curves, I can't
calibrate it in PQ mode to eliminate the error.
> It's pretty well understood in the Video calibrator world - Plasma, OLED
> and HDR are all known to have area brightness limiters.
Ah, forgive my ignorance in this domain; I've only ever worked with LCD
calibration. I did not know that e.g. plasma displays suffered from the
same limitation.