[argyllcms] Graphics card

  • From: Nikolay Pokhilchenko <nikolay_po@xxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Sun, 27 Feb 2011 12:20:19 +0300

As far as I know, ATI cards perform dithering when going from a higher bit 
depth to a lower one. For example, ATI applies spatial and temporal dithering 
when going from the 10-bit linearization curves to the 8-bit DVI/HDMI output. 
Nvidia usually does not.
So it's often impossible to use display calibration on Nvidia, because the 
8-bit interface (DVI/HDMI) sits between the video card's LUT and the display, 
and banding occurs.
Users have had to switch their graphics adapters from Nvidia to ATI because 
ATI's dithering eliminates the banding.
If both the adapter and the display have a DisplayPort interface, there should 
be no problem with Nvidia (I suppose), because of the 10-bit depth of the 
DisplayPort interface. In that case the dithering is done in the display by 
its own processor.
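To illustrate why dithering matters here, the following sketch simulates the 
quantization step. It is only an illustration of the general principle, not 
the actual ATI hardware algorithm: a 10-bit level that falls between two 8-bit 
codes is lost by plain truncation (producing banding), but random spatial 
dithering recovers it in the spatial average.

```python
import random

def quantize_8bit(v10):
    # Plain truncation: 10-bit value (0..1023) -> 8-bit value (0..255).
    # Every pixel at the same 10-bit level maps to the same 8-bit code,
    # so intermediate levels collapse into bands.
    return v10 >> 2

def quantize_8bit_dithered(v10, rng):
    # Random dither (illustrative, not the real hardware method):
    # add noise smaller than one quantization step before truncating,
    # so neighboring pixels alternate between the two nearest 8-bit
    # codes and the average preserves the 10-bit level.
    return min(255, (v10 + rng.randrange(4)) >> 2)

rng = random.Random(0)
level = 514          # a 10-bit level between 8-bit codes 128 and 129
n = 10000            # simulate n pixels at this level

plain = quantize_8bit(level)  # always 128: the half-step is lost
avg = sum(quantize_8bit_dithered(level, rng) for _ in range(n)) / n
# avg comes out near 514/4 = 128.5, i.e. the eye, averaging over
# the dithered pixels, sees the intermediate level
```

The same idea applies temporally: alternating the two codes frame by frame 
averages out in time instead of space.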

On Sat, 26 Feb 2011 17:20:22 -0500, adam k <aak1946@xxxxxxxxx> wrote:

> Are ATI graphic cards as good as Nvidia?
> Thanks!
> Sent from iPhone
