-------- Original Message --------
From: Milan Knížek
Sent: Monday, September 1, 2014 3:52 PM
To: argyllcms@xxxxxxxxxxxxx
Reply-To: argyllcms@xxxxxxxxxxxxx
Subject: [argyllcms] Re: nVidia GeForce & deep colour support in Linux

Kai-Uwe Behrmann wrote on Sun, 31 Aug 2014 at 22:00 +0200:
> On 30 August 2014 22:29:55 MESZ, "János, Tóth F." <janos666@xxxxxxxxxx> wrote:
> > 2014-08-30 21:52 GMT+02:00 Marwan Daar <marwan.daar@xxxxxxxxx>:
> > > Not sure if this is helpful, relevant, or even meaningful, but my
> > > understanding is that regular GeForce cards only support 10 bit in a
> > > DirectX environment, whereas you need a Quadro for 10-bit OpenGL
> > > support.

I have tried to get some info from nVidia's devtalk Linux forums (59 views, no response yet). The README mentions a few bits about the requirements:

    This driver release supports X screens with screen depths of 30 bits
    per pixel (10 bits per color component). This provides about 1 billion
    possible colors, allowing for higher color precision and smoother
    gradients.

    When displaying a depth 30 image, the color data may be dithered to
    lower bit depths, depending on the capabilities of the display device
    and how it is connected to the GPU. Some devices connected via analog
    VGA or DisplayPort can display the full 10 bit range of colors.
    Devices connected via DVI or HDMI, as well as laptop internal panels
    connected via LVDS, will be dithered to 8 or 6 bits per pixel.

    To work reliably, depth 30 requires X.Org 7.3 or higher and pixman
    0.11.6 or higher.

    In addition to the above software requirements, many X applications
    and toolkits do not understand depth 30 visuals as of this writing.
    Some programs may work correctly, some may work but display incorrect
    colors, and some may simply fail to run. In particular, many OpenGL
    applications request 8 bits of alpha when searching for FBConfigs.
    Since depth 30 visuals have only 2 bits of alpha, no suitable
    FBConfigs will be found and such applications will fail to start.
> OpenGL works fine with 30-bit visuals under X.org, be it Krita, Compiz,
> KWin or ICC Examin.
>
>     xwininfo -root | grep Depth
>
> shows me "Depth: 30". That means any application which is capable of
> encoding 10 bits per plane can display 30-bit. It is merely a question
> of finding a toolkit that supports this. Using Xlib directly should
> enable applications for high bit depth, which is mostly abstracted away
> by Linux graphics APIs these days. The easier way is using OpenGL.

I can also run the X server with 30-bit depth (the X.org log confirms
it) with a 32 bpp framebuffer (so only 2 bits remain for transparency).
I have found the same info in an Xorg.log posted on internet forums by a
Quadro card user.

I also checked with the Dell U2713H support community forum that the LCD
actually has a 10-bit input and works with MS Windows + Quadro /
FirePro + Photoshop.

Is there anything special needed for Krita to display in 30-bit depth,
other than choosing "Enable OpenGL" in Preferences / Display and loading
a 16-bit encoded image? I used Compiz as the WM.

regards,
Milan
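[For reference, a depth-30 X screen as described above is usually requested in xorg.conf roughly like this. This is a sketch: the Identifier, Device and Monitor names are placeholders for whatever your existing config uses, and the installed driver must actually support depth 30 for it to take effect.]

```
Section "Screen"
    Identifier   "Screen0"
    Device       "Device0"
    Monitor      "Monitor0"
    DefaultDepth 30
    SubSection "Display"
        Depth 30
    EndSubSection
EndSection
```

After restarting X, `xwininfo -root | grep Depth` (as quoted above) should report "Depth: 30" if the change took effect.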