[argyllcms] Re: Programming the video card
- From: <graxx@xxxxxxxxxxxx>
- To: <argyllcms@xxxxxxxxxxxxx>
- Date: Sat, 28 Dec 2019 17:35:26 -0500
At some point, absent further information, I am tempted to believe the
information returned by my system, driver options and monitor settings.
According to the latest Studio driver news read online, released in time for
this year’s SIGGRAPH, NVIDIA’s “10-bit” display path for the GeForce series
applies to Photoshop as well, Mete:
https://www.nvidia.com/en-us/geforce/news/studio-driver/?ncid=so-twit-88958#cid=organicSocial_en-us_Twitter_NVIDIA_Studio
Towards the bottom of the page:
Multiple creative applications currently take advantage of 30-bit color,
including Adobe Photoshop, Adobe Premiere Pro, Autodesk RV, Colorfront
Transkoder, Assimilate Scratch, and Foundry Nuke.
Add to that the “fact” that I see no banding on that test ramp image in
Photoshop with all the 10-bit options turned on everywhere. I don’t believe
10-bit is only for Mac users…
But I have to prove it to myself with numbers.
/ Roger
From: argyllcms-bounce@xxxxxxxxxxxxx <argyllcms-bounce@xxxxxxxxxxxxx> On Behalf
Of Knut Inge
Sent: December 28, 2019 5:27 PM
To: argyllcms@xxxxxxxxxxxxx
Subject: [argyllcms] Re: Programming the video card
https://www.engadget.com/2019/07/29/nvidia-studio-laptops-10-bit-photoshop/
« The latest Studio driver, due to be released shortly, will support 10-bit
color for all GPUs in Adobe Photoshop CC, Premier CC and other OpenGL-powered
apps. »
Only supported in OpenGL, while «normal» framebuffers are out of luck?
It seems that the 10-bit thing has been promised and broken on and off again
for 10 years by Adobe, Microsoft and the GPU vendors. I would not be surprised
if their «10-bit option» is actually 8 bits somewhere along the line...
For anyone using workstation GPU hardware («Quadro»?), it seems that 10-bit
support is more dependable.
On Saturday, December 28, 2019, <graxx@xxxxxxxxxxxx> wrote:
Thank you Mete.
I have a hard time distinguishing between:
A) enabling “HDR10”on my Windows 10 installation;
B) the fact that my video card does NOT support HDR10;
C) the fact that GammaRamp is only 8-bit (in and out).
Yet, I have Photoshop 10-bit enabled (I suspect successfully) and I have 10-bit
selected in the NVIDIA ControlPanel.
I’m a little confused as to the exact path traveled by the pixels out of
Photoshop, to the video card and to the monitor.
One thing is sure: if I make any change to the GammaRamp in my system, I
immediately see the effect on my monitor, which turns lighter or darker.
I don’t have a problem explaining an 8-bit path; I understand how the pixels
can undergo ‘changes’ in the video LUT through the GammaRamp. But I am lost as
to what happens in 10-bit mode.
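For what it’s worth, here is one plausible model of the 10-bit case (purely an assumption on my part, not anything I found in NVIDIA or Microsoft documentation): the hardware could treat the 256-entry ramp as a curve and linearly interpolate between entries for the four-times-finer input codes. A portable C sketch of that idea:

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical model only: with a 10-bit framebuffer, the hardware might
 * linearly interpolate between the 256 ramp entries for the extra input
 * codes. in10 is a 10-bit pixel value (0..1023). */
static uint16_t ramp_lookup_10bit(const uint16_t ramp[256], uint16_t in10)
{
    uint16_t i    = in10 >> 2;  /* which of the 256 entries */
    uint16_t frac = in10 & 3;   /* position between entries (0..3) */
    uint16_t next = (i < 255) ? ramp[i + 1] : ramp[255];
    return (uint16_t)(ramp[i] + ((next - ramp[i]) * frac) / 4);
}
```

With a linear (identity) ramp, this maps the 10-bit input 512 to 32768, i.e. the same mid-scale point the 8-bit path reaches at input 128, so the 256-entry ramp would still shape the curve even in a 10-bit mode.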
/ Roger
From: argyllcms-bounce@xxxxxxxxxxxxx <argyllcms-bounce@xxxxxxxxxxxxx> On
Behalf Of Mete Balci
Sent: December 28, 2019 2:36 PM
To: argyllcms@xxxxxxxxxxxxx
Subject: [argyllcms] Re: Programming the video card
Windows 10 supports 10 bpc (30-bit) color; enabling the software/driver option
is enough for me (my display is an Eizo CG), and there is nothing configurable
in the display menu for this: it automatically uses whatever is sent, as long
as it is supported. For Photoshop, I think you need to enable the 30-bit
display setting manually.
Mete
On Sat, 28 Dec 2019 at 6:38 PM, <graxx@xxxxxxxxxxxx> wrote:
Yesterday I upgraded my NVIDIA 1070 video card to benefit from 10-bit
performance. But I need to determine, at the hardware level, what changes
have taken place.
The GetDeviceGammaRamp() call still returns 256 elements, regardless of bit depth.
This documentation, found on Microsoft’s site, is intriguing to me:
Pointer to a buffer containing the gamma ramp to be set. The gamma ramp is
specified in three arrays of 256 WORD elements each, which contain the
mapping between RGB values in the frame buffer and digital-analog-converter
(DAC) values. The RGB values must be stored in the most significant bits
of each WORD to increase DAC independence.
The last sentence is puzzling.
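To illustrate my reading of that sentence: if the ramp value sits in the most significant bits of each WORD, then a DAC of any depth can simply keep its top N bits and ignore the rest. A small C sketch, assuming a linear (identity) ramp:

```c
#include <assert.h>
#include <stdint.h>

/* Build a linear ("identity") ramp with each value stored in the most
 * significant bits of a 16-bit WORD: entry i holds i << 8 (0x0000..0xFF00). */
static void identity_ramp(uint16_t ramp[256])
{
    for (int i = 0; i < 256; i++)
        ramp[i] = (uint16_t)(i << 8);
}

/* What an N-bit DAC makes of a 16-bit ramp entry under MSB storage:
 * it keeps only the top dac_bits bits. The same WORD therefore serves
 * an 8-bit and a 10-bit DAC alike -- hence "DAC independence". */
static uint16_t dac_value(uint16_t entry, unsigned dac_bits)
{
    return (uint16_t)(entry >> (16 - dac_bits));
}
```

For example, entry 128 of the identity ramp is 0x8000 (32768): an 8-bit DAC reads it as 128, a 10-bit DAC as 512, both exactly mid-scale.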
So I ran the function again and inspected the values encoded in each of the 256
elements, and got the following:
https://1drv.ms/u/s!AkD78CVR1NBqkod1y8wHvbzWpqIaaQ?e=a1nrmF
As you can see, the list shows the contents of the 256-element array. On the
left-hand side, circled in red, the contents are displayed in numeric form,
and on the right, circled in green, in hexadecimal.
I purposely scrolled down to the value “128” because it is the middle of the
scale. Its numeric value is 32,768; in hexadecimal, this is 8000, shown as
“H8000” in the screen capture.
Now, coming back to this “significant bits” question… When I view the list in
decimal, I only see a sequential list of numbers, but when I switch to
hexadecimal, I see a “pattern” in the numbers: the last two digits are
always “00”. In binary, 32768 is 1000 0000 0000 0000.
So, in my case, the last 8 bits are always 0000 0000, regardless of the
element, meaning that only the first 8 bits are “significant” (actually change).
The last element (255) is &HFF00, or 1111 1111 0000 0000, or 65280, but you see,
I’m still “stuck” with the last 8 bits as 0000 0000.
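That observation can be checked programmatically rather than by eye. A C sketch (the helper function is just something I made up for this purpose) that scans a ramp and reports whether anything ever varies below the top 8 bits:

```c
#include <assert.h>
#include <stdint.h>

/* Return 1 if every entry has its low 8 bits zero, i.e. the ramp only
 * ever uses 256 distinct output levels; return 0 if any entry carries
 * extra precision in the low byte. */
static int ramp_is_8bit_only(const uint16_t ramp[256])
{
    for (int i = 0; i < 256; i++)
        if ((ramp[i] & 0x00FF) != 0)
            return 0;
    return 1;
}
```

Run against a freshly read ramp, a result of 1 would confirm that only 8 bits are in play; any calibration curve that needs sub-8-bit precision would make it return 0.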
With a “10-bit” display option activated in the driver, I would have expected
to see a “change” at the hardware level, somewhere. This is “mystery meat” to
me…
I corresponded with NEC tech support and got this reply:
there are no settings in the monitor required to activate 10 bit
display. However, only the DisplayPort input on this model supports 10 bit
input. Also, your operating system would also need to support 10 bit in
addition to your GPU and display driver.
So I am using the latest version of the NVIDIA display driver, which supports 10-bit.
Does Windows 10 support 10-bit display? I am not entirely sure.
I do use a DisplayPort cable to connect my video card to my monitor.
And since, in principle, there are no settings in the monitor to activate
10-bit display, then, provided Windows 10 does support 10-bit display (but I
can’t “prove it to myself”), I should have 10-bit display in Photoshop.
I feel I am going in circles…
/ Roger
From: argyllcms-bounce@xxxxxxxxxxxxx <argyllcms-bounce@xxxxxxxxxxxxx> On
Behalf Of Knut Inge
Sent: December 28, 2019 11:46 AM
To: argyllcms@xxxxxxxxxxxxx
Subject: [argyllcms] Re: Programming the video card
https://www.cnet.com/news/nvidia-studio-drivers-deliver-geforce-30-bit-color-unto-photoshop-and-more/
On Saturday, December 28, 2019, <graxx@xxxxxxxxxxxx> wrote:
Florian,
Many thanks!!!!
So, the API always returns 256 elements, irrespective of 8-bit vs 10-bit
video LUT? Yet, I noticed the numeric values returned in each of the 256
elements range from 0 to 65535, which is 16-bit (not 8-bit). Do you know
why?
Please allow me two questions...
A) Is gamma calibration always done in 8-bit?
B) Is there a method to determine the "bit depth" of the video card?
Regards / Roger
-----Original Message-----
From: argyllcms-bounce@xxxxxxxxxxxxx <argyllcms-bounce@xxxxxxxxxxxxx> On
Behalf Of Florian Höch
Sent: December 27, 2019 5:19 PM
To: argyllcms@xxxxxxxxxxxxx
Subject: [argyllcms] Re: Programming the video card
Hi Roger,
On 27.12.2019 at 18:05, graxx@xxxxxxxxxxxx wrote:
I had some old VisualBasic code to access the ‘Ramp’ inside my video
card. As you can see in the following screen capture, the ‘Length’ of
the RGB arrays is 256 elements:
as the Get/SetDeviceGammaRamp API is an abstraction provided by the
operating system, it always returns/takes 3x256 16-bit elements,
irrespective of the underlying hardware's capabilities.
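In other words, the buffer exchanged with Get/SetDeviceGammaRamp has a fixed shape no matter what the hardware can do. A portable C mock of that layout (the struct name is mine; the real Win32 calls take an HDC and a pointer to exactly this much data):

```c
#include <assert.h>
#include <stdint.h>

/* The OS-level gamma-ramp buffer: always 3 channels x 256 entries x 16 bits,
 * 1536 bytes in total, regardless of the panel's or DAC's actual depth. */
typedef struct {
    uint16_t red[256];
    uint16_t green[256];
    uint16_t blue[256];
} GammaRampBuffer;
```

This is why the API cannot, by itself, reveal the video card's true LUT depth: the driver quantizes these 16-bit values down to whatever the hardware supports.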
Florian.