This has recently been brought to my attention. Evidently, some monitors can change their internal LUTs even though the capability is not documented. Does anyone have links to discussions of this anywhere?

The advantage of doing anything in the display would be that its internal processing is probably 10, 12 or 14 bits, with direct access to the panel and a minimum of quantization in between, compared to the (usually) 8-bit quantization that happens when crossing the DVI link. But any LUT (especially a 3D LUT) can only decrease the gamut, right? The reason for using one would be to calibrate the display's behaviour to some standard response.

If you can characterize the whole chain from GPU frame buffer to visible light, it seems that in most cases one would prefer to give the software color management component full freedom and full access to the capabilities of your display, so it can do the optimal mapping from image file to visible light.

Known exceptions:

- When limited to 8 bits it makes sense to distribute them nonlinearly (aka gamma); you want to spend most of them close to black.

- For day-to-day use it would make a lot of sense to upload a "perfect" sRGB correction curve into the display, meaning that non-color-managed applications would look as good as possible. Ideally, when loading e.g. Photoshop, you would want it to upload a new display LUT, swapping to something closer to the panel's native response.

-k
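To make the bit-distribution point concrete, here is a small sketch (my own illustration, not from any standard): it counts how many of the 256 8-bit codes fall below 1% of peak luminance under a linear encoding versus a simple pure-power 2.2 gamma encoding. The 2.2 figure is an assumption for illustration; sRGB's overall response is close to it.

```python
GAMMA = 2.2  # assumed display gamma, for illustration only

def luminance(code, gamma):
    """Relative luminance produced by an 8-bit code under a pure power law."""
    return (code / 255.0) ** gamma

# Linear encoding: code value is proportional to luminance.
linear_codes = sum(1 for c in range(256) if c / 255.0 < 0.01)

# Gamma encoding: luminance is (code/255)^gamma, so codes cluster near black.
gamma_codes = sum(1 for c in range(256) if luminance(c, GAMMA) < 0.01)

print(f"codes below 1% luminance, linear encoding:    {linear_codes}")   # 3
print(f"codes below 1% luminance, gamma {GAMMA} encoding: {gamma_codes}")  # 32
```

With linear 8-bit coding only 3 codes cover the bottom two decades of luminance where the eye is most sensitive; the gamma encoding spends 32 codes there, which is why a nonlinear distribution is the sensible way to spend 8 bits.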
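As a sketch of what an sRGB correction curve could look like: assuming (hypothetically) that the display's measured native response is a pure 2.4 power law, a 1D LUT entry just asks the sRGB EOTF what luminance each input level should produce, then inverts the native response to find the drive level that produces it. A real curve would come from an actual characterization, not an assumed power law.

```python
def srgb_eotf(v):
    """sRGB electro-optical transfer function (IEC 61966-2-1)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

NATIVE_GAMMA = 2.4  # assumed native panel response; would come from measurement

def srgb_correction_lut(size=256):
    """1D LUT so that lut -> native panel response approximates sRGB."""
    lut = []
    for i in range(size):
        target = srgb_eotf(i / (size - 1))       # luminance sRGB demands
        drive = target ** (1.0 / NATIVE_GAMMA)   # drive level producing it
        lut.append(round(drive * (size - 1)))
    return lut

lut = srgb_correction_lut()
```

Uploading such a curve into the display (rather than the video card) is where the extra internal bit depth would pay off: the rounding above is to 8 bits only because the LUT here is 256 entries; a 10- or 14-bit internal LUT quantizes far less.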