Thu, 13 Oct 2011 17:54:45 +0400 Jens Heermann wrote:

> ... please be advised that "wide gamut" is not everything. For high-end soft-proofing, only monitors that can be hardware-calibrated (direct access to the monitor's internal LUT) can be recommended... Therefore I would exclude those Dell and HP models, as well as all Apple Cinema Displays. They may feature a wide gamut, but it will shrink significantly when these models are calibrated via "software calibration", which offers only 8-bit resolution for gray steps, because the correction curves are stored in the graphics card and the correction is also applied in the ICC profile.

I should say that the problem you mention arises only when the connection between monitor and PC is effectively 8 bits deep. In that case you are right: 8 bits is quite low for wide-gamut displays, because a single 1/256 step on a highly saturated channel shifts the color significantly, and calibration in the video adapter's LUT leads to color banding in gray gradients. But there are several techniques to gain effective bit depth and reduce banding.

One way is to replace the 8-bit interface between PC and display with a 10-bit one, for example DisplayPort. Several video adapters have this port, and many pro-grade displays can be attached over DisplayPort with the appropriate cable. In that case there is little difference between "hardware" calibration (inside the display) and "software" calibration (in the video adapter's LUT).

Another way is to keep the 8-bit interconnect but add temporal (and possibly spatial) dithering. Many ATI graphics adapters have dithering on their 8-bit outputs enabled by default. Dithering significantly reduces color banding on wide-gamut displays under software calibration; the effective (measured) bit depth with dithering can be as high as 9 bits. Pastel color gradients with "software" calibration over a dithered interface may even be smoother than gradients without software calibration and without dithering.
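The gray-level loss from 8-bit software calibration is easy to demonstrate. Here is a minimal sketch (the correction curve is a hypothetical gamma tweak, not any particular profile): pushing all 256 input codes through a curve and rounding back to 8 bits makes some adjacent levels collide, and those collapsed levels are what you see as banding.

```python
# Sketch: why an 8-bit video-card LUT loses gray levels.
# Hypothetical correction curve: remapping gamma 2.2 -> 2.4,
# rounded back to 8-bit codes as a real LUT must do.
curve = [round(255 * (v / 255) ** (2.2 / 2.4)) for v in range(256)]

unique_levels = len(set(curve))   # distinct output codes actually reachable
lost = 256 - unique_levels        # adjacent inputs that collapsed together
print(f"{lost} of 256 gray levels lost to 8-bit rounding")
```

With a 10-bit LUT (or the correction applied inside the display) the same curve can be stored at finer resolution, so neighboring input levels keep distinct outputs.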
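The temporal-dithering idea can be sketched in a few lines. This is an illustrative error-feedback loop, not ATI's actual hardware algorithm: a target level that falls between two 8-bit codes is alternated across successive frames so that its time average lands on the in-between value.

```python
def dither_8bit(target, n_frames):
    """Temporal-dithering sketch: emit integer 8-bit codes whose
    time average approximates a finer-than-8-bit target.
    `target` is in 8-bit code units, e.g. 100.25 means a quarter
    of a step above code 100."""
    frames, err = [], 0.0
    for _ in range(n_frames):
        code = round(target + err)   # what actually goes over the 8-bit link
        err += target - code         # carry the quantization error forward
        frames.append(code)
    return frames

frames = dither_8bit(100.25, 256)
static = round(100.25)               # plain 8-bit output: stuck at code 100
average = sum(frames) / len(frames)  # dithered time average: 100.25
```

The display's own response (and the eye) integrates the alternating 100/101 codes, so the viewer sees an intermediate level that an undithered 8-bit link cannot produce.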