> > > So it still has a bug.
>
> Looking at the source code, the problem is in
> MSWin's XF86VidModeSetGamma() function. xcalib should avoid this
> function and set the lookup values itself.
>
> Of course it's possible that XF86VidModeSetGamma() actually knows the
> hardware precision, while SetDeviceGammaRamp() caches the 16-bit values.

Great! So, again, from the layman's perspective, this means that Argyll's
default curves are correct, while the Windows-set default curves *might*
be wrong, I guess.

By the way, do you think this can be tested by checking both cal files
with dispcal -r? It should show the apparent number of significant bits,
right? Also, there may be no difference between Argyll-set and system-set
default gamma in Linux, and this would prove that the Windows-set curves
*are*, in fact, wrong.

From the practical perspective, can you please confirm that those who
profile their monitors in Windows without prior calibration should reset
their gamma with Argyll first?
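For what it's worth, here is a rough sketch (not part of Argyll or xcalib,
just an illustration) of how one could estimate the "apparent number of
significant bits" from a dumped 16-bit gamma ramp. The assumption is that
an n-bit ramp stored in 16-bit entries uses bit replication (the usual way
of expanding n bits to 16), so we look for the smallest n at which
requantizing leaves every entry unchanged:

```python
def effective_bits(ramp, total_bits=16):
    """Estimate how many significant bits a 16-bit gamma ramp carries.

    Assumes n-bit values are expanded to 16 bits by bit replication
    (e.g. 8-bit 0xAB becomes 0xABAB). Returns the smallest n for which
    requantizing to n bits reproduces every entry exactly.
    """
    for n in range(1, total_bits + 1):
        shift = total_bits - n

        def requantize(v, n=n, shift=shift):
            # Take the top n bits, then replicate them to fill 16 bits.
            top = v >> shift
            out = 0
            filled = 0
            while filled < total_bits:
                out = (out << n) | top
                filled += n
            return out >> (filled - total_bits)

        if all(requantize(v) == v for v in ramp):
            return n
    return total_bits


# Usage: an 8-bit ramp expanded by bit replication reports 8 bits.
ramp_8bit = [(i << 8) | i for i in range(256)]
print(effective_bits(ramp_8bit))
```

This only detects replication-style padding; a driver that rounds or
dithers differently would need a looser test, which is presumably what
dispcal does when it reports the effective depth.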