[argyllcms] Re: Delta-E - the wrong value to optimize?

  • From: Florian Höch <lists+argyllcms@xxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Thu, 24 Jun 2010 19:08:08 +0200


Which parameters did you choose for calibration, and did you check the uncalibrated response? (The native whitepoint, blackpoint and gamma can be read with dispcal -R, for example.) Did you just calibrate, or also use dispread/colprof to create a profile? (A profile can introduce artifacts of its own when used, even without calibration.)

All in all, it really depends on whether 8 bits is enough :)

In an 8 bit data path, some calibration artifacts like banding will always occur. How pronounced/visible they are greatly depends on the native characteristics of a display and how far the chosen calibration parameters are from them. Which error-minimizing function is used to approach those target parameters during calibration shouldn't play a big role here afaik.

If you have enough bits available, visible artifacts could be avoided altogether, but today most digitally driven displays are limited to 8 bits on the input side (and computer systems on the output side, respectively). One solution is displays which store the calibration with a higher bit depth, typically >=10 bits, in hardware of the display itself instead of the video card's gamma table (not supported by Argyll atm), while the digital connection is still 8 bits. Another is driving a display over analogue VGA with a graphics card which supports >8 bits via its RAMDAC (this won't increase the bit depth of the source material of course, but can be used to apply calibration curves without introducing visible artifacts). Yet another solution would be increasing the bit depth of the data path itself, but hard- and software then also need to support that (video card, the digital connection used, OS, drivers, applications), and I think it's not something that is possible with today's OSes and applications yet, even though there are specifications and implementations for connectors and protocols (HDMI and DisplayPort come to mind).
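To illustrate why the LUT bit depth matters (a rough sketch, not Argyll code; the gamma values 2.4 and 2.2 are just assumed example numbers for a native and a target response):

```python
# Simulate pushing all 256 8-bit input codes through a calibration
# curve that corrects an assumed native gamma of 2.4 to a target of 2.2,
# quantized to a LUT of the given bit depth. Where the curve's slope is
# below 1, neighbouring input codes collapse onto the same output code
# in an 8-bit LUT -- that collapse is what shows up as banding.

def calibrated_codes(lut_bits):
    """Return the set of distinct output codes after quantization."""
    max_code = 2 ** lut_bits - 1
    out = set()
    for v in range(256):
        x = v / 255.0
        y = x ** (2.2 / 2.4)  # correction curve: native 2.4 -> target 2.2
        out.add(round(y * max_code))
    return out

# An 8-bit LUT loses codes; a 10-bit LUT keeps all 256 inputs distinct.
print(len(calibrated_codes(8)), len(calibrated_codes(10)))
```

With 8 bits, fewer than 256 distinct output codes survive, so some adjacent input levels become identical on screen; with a 10-bit LUT (as in displays with internal calibration hardware) all 256 input codes stay distinguishable even though the correction is non-linear.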

On 24.06.2010 16:51, Jon Zeeff wrote:
I have a plasma monitor that looks much worse when calibrated - bad
posterization in the brighter reds. Is there anything I can do? Perhaps
DeltaE isn't the right function to minimize and a new function that
corresponds better to how a human will perceive the overall look is
needed - ie, smoothness is critical. Any suggestions on modifying the
code (feasibility, where to start, etc)? Any tips on why this is
occurring? Is 8 bits/color just not enough?

Florian Höch
