[argyllcms] Re: nVidia GeForce & deep colour support in Linux

  • From: Marwan Daar <marwan.daar@xxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Sat, 30 Aug 2014 15:52:23 -0400

Not sure if this is helpful, relevant, or even meaningful, but my understanding is that regular GeForce cards only support 10-bit output in a DirectX environment, whereas you need a Quadro for 10-bit OpenGL support (although it is disturbing that Chris is having issues despite his Quadro card).


Marwan


Hello,

Sorry for an off-topic question: does anybody have a working setup in
Linux like:

x nVidia GeForce & Display Port & native 10-bit depth LCD

x nVidia GeForce & Display Port & native 8-bit + FRC LCD panel [1]

[1] http://www.tftcentral.co.uk/reviews/dell_u2713h.htm
or  http://www.prad.de/new/monitore/test/2013/test-asus-pa279q.html


Based on an earlier announcement [2], the nvidia binary driver for Linux
has supported 30-bit depth (RGB) in X for some time. I succeeded in
turning it on, but got some glitches / colour artefacts as described on
the Oyranos blog [3].

[2] http://www.nvidia.com/object/linux-display-ia32-295.20-driver.html
[3] http://www.oyranos.org/2014/05/image-editing-with-30-bit-monitors/
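For reference, enabling 30-bit depth with the nvidia driver is typically
done by setting DefaultDepth 30 in the Screen section of xorg.conf. A
minimal sketch (the Identifier, Device, and Monitor names here are
placeholders, not taken from my actual config):

```
Section "Screen"
    Identifier   "Screen0"
    Device       "nvidia0"
    Monitor      "U2713H"
    # 10 bits per colour channel
    DefaultDepth 30
    SubSection "Display"
        Depth    30
    EndSubSection
EndSection
```

With this in place, X starts at depth 30, but as described above that
alone does not guarantee a 10-bit path all the way to the panel.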


However, the 1024-step ramp was still displayed on my U2713H with only
8-bit gradation (in Krita with the OpenGL backend enabled, which is one
of the few applications that supposedly support deep-colour output).
dispcal with a ColorMunki Photo also reports only 8-bit precision for
the video-card LUT.
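The 1024-vs-256 step distinction can be sketched in Python. This is only
a toy model of bit-depth truncation (the function name is made up for
illustration), not a real test of the driver path, but it shows why a
10-bit ramp collapses to 256 visible gradations on an 8-bit pipeline:

```python
# Toy model: how many distinct levels survive when a ramp of
# ramp_bits precision passes through an output_bits pipeline?

def visible_steps(ramp_bits, output_bits):
    """Count distinct levels left after truncating a full ramp."""
    levels = 1 << ramp_bits                  # e.g. 1024 for a 10-bit ramp
    shift = max(ramp_bits - output_bits, 0)  # bits dropped by the pipeline
    return len({v >> shift for v in range(levels)})

print(visible_steps(10, 10))  # 1024 -- true deep-colour output
print(visible_steps(10, 8))   # 256  -- what an 8-bit pipeline shows
```

So a 1024-step ramp that shows only 256 gradations is consistent with an
8-bit bottleneck somewhere between the application and the panel.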

I have access neither to a Quadro card nor to a native 10-bit LCD to do
more tests.

So, has anybody made a GeForce output deep colour in Linux, or is it a
futile effort? (I have been googling and asking for a few weeks now
without success.)

Regards,
Milan


