János Tóth F. wrote:
- Is the ColorMunki (in Adaptive HiRes mode) + ArgyllCMS (1.3.2) capable of calibrating (and/or profiling) displays with NeoPDP panels?
Given that the ColorMunki is a spectrometer, the likely answer is yes. What reason would there be for it not to work?
My biggest concern was that this TV doesn't show the usual 8-bit gradient image properly. From up close I saw "noisy s##t", and from a greater (normal viewing) distance a smooth gradient with less noise. But I was unable to distinguish the gray shades from that distance. It felt like some built-in de-banding (or blur, or something or other, but nothing good...).
The basic problem with Plasma is that it has limited linear level control. The initial panels had only 16 linear levels. Used without dithering this would give horrible contouring/posterization, so the manufacturers pulled every trick in the book to cover it up - spatial and dynamic dithering, hence the noise you notice. The dynamic dithering also creates artefacts such as dynamic contouring, where a moving image cancels the dynamic dithering. Modern panels have increased the number of levels, but the basic problems seem to remain, as you have noticed.
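The spatial dithering described above can be sketched in a few lines. This is a minimal, illustrative example using a classic ordered (Bayer) threshold matrix; the matrix and the 16-level target are assumptions for illustration, not the panel's actual scheme:

```python
# Sketch of spatial (ordered) dithering: approximating a smooth gradient
# on a panel with only 16 linear levels. Up close the result is noisy;
# averaged over a region, it tracks the original ramp.
import numpy as np

# 4x4 Bayer threshold matrix, normalised to [0, 1)
BAYER4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

def dither_to_levels(img, levels):
    """Quantise img (floats in 0..1) to `levels` linear levels with ordered dithering."""
    h, w = img.shape
    t = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    scaled = img * (levels - 1)
    return np.floor(scaled + t).clip(0, levels - 1) / (levels - 1)

# A smooth 8-bit-style ramp pushed through 16 levels:
ramp = np.linspace(0.0, 1.0, 256)[None, :].repeat(8, axis=0)
out = dither_to_levels(ramp, 16)
```

The output uses only 16 distinct values, yet its local average follows the gradient, which is exactly why the noise disappears at normal viewing distance.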
another problem here... I don't know. But it's not fair to say that this panel has "6145 own gray shades". It couldn't reproduce a proper 8-bit gradient, even with its most friendly OSD settings. (Not to mention the other color modes...)
It would seem that marketing has lost touch with reality, if this is the case.
I measured a lower contrast ratio (with a higher black level) than I usually measure on my LCD (with its initial settings, before the WP adjustment). It was ~800:1, but the perceived contrast was very good. The black looked much deeper than on the other LCD in the same room (which measured around 950:1 before calibration and 700:1 after).
This could be a viewing-angle effect. The instrument is measuring at 90 degrees, while LCDs tend to have a worse-looking black at other viewing angles. A phosphor-based screen like a plasma will tend to look more uniform from other viewing angles.
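For reference, the contrast ratios being compared here are simply white luminance divided by black luminance. A toy calculation (the cd/m² figures below are invented for illustration, not taken from the measurements above):

```python
# Contrast ratio = white luminance / black luminance.
# Example: a white of 120 cd/m^2 over a black of 0.15 cd/m^2 gives 800:1,
# so a lower black level raises the ratio even at the same peak white.
def contrast_ratio(white_cdm2, black_cdm2):
    return white_cdm2 / black_cdm2

ratio = contrast_ratio(120.0, 0.15)   # approximately 800:1
```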
And a very strange thing: the report said that the VGA LUT precision is 10 bits/color. How the hell??? The noise pattern on the display is so coarse that the sensor can only see one small square of the larger noise pattern. There is no way it can see the effect of the whole pattern. This was the point when I started to worry about the dithering of the VGA card.
The instrument integration time is reasonably long, so as to improve sensor precision and also to average out any refresh frequencies. It could be that there really are 10 bits over a long enough period. It's hard to know what to make of your experience, since the characteristics of the display would have to be examined in some detail to ascertain why it doesn't calibrate well.

Graeme Gill
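The averaging effect mentioned above - temporal dithering yielding extra effective bits over a long integration - can be sketched as follows. The 4-frame cycle here is an illustrative assumption, not any particular video card's actual scheme:

```python
# Sketch of temporal dithering: an 8-bit output alternating between two
# adjacent codes over a 4-frame cycle averages to a 10-bit effective level,
# which a long instrument integration would measure.
def temporal_dither_frames(value10, n_frames=4):
    """Emit n_frames 8-bit codes whose average is value10 / n_frames."""
    base, frac = divmod(value10, n_frames)   # 10-bit code = 8-bit base + 2-bit fraction
    # Show the higher code in `frac` of the n frames, the base code otherwise.
    return [base + (1 if i < frac else 0) for i in range(n_frames)]

frames = temporal_dither_frames(513)         # 10-bit code 513 = 128.25 in 8-bit units
avg = sum(frames) / len(frames)              # the instrument sees 128.25
```

Each individual frame is still only 8-bit; the extra precision exists only in the time average, which is why a long integration can report 10 bits while a single refresh cannot.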