[argyllcms] Re: Question regarding gamut mapping for photographic images

  • From: Graeme Gill <graeme@xxxxxxxxxxxxx>
  • To: argyllcms@xxxxxxxxxxxxx
  • Date: Fri, 22 Jul 2005 22:07:14 +1000

Greg Sullivan wrote:

> Referring to this extract from a post to newsgroup sci.engr.color by
> tlianza, in thread "Chromacities of digital photo-exposer":
>
> "The final reproduction gamma of a reproduced image from an sRGB image
> through a Frontier printer should have a relative gamma of 1.2 to 1.6 and
> the extremes of sRGB should be mapped to the extremes of print material.
> Anything less than that will yield an image that appears flat and lifeless
> when viewed at interior lighting conditions."
>
> URL for the complete message:
> http://groups-beta.google.com/group/sci.engr.color/msg/cd99c3af72b00a6b?hl=en&
It's really a matter of philosophy. If you want the images to be as colorful as possible, then by all means expand the source gamut out if it's smaller than the destination. For some purposes this may not be desirable though, since at an extreme (converting a small gamut space into a very large one) the images may begin to look a bit silly.

> Can I achieve this when I create a printer profile with Argyll? Is this,
> in fact, simply what a perceptual intent normally does?

You can achieve something like this if you use the "saturation" intent (use profile -t5 to use the saturation intent for the perceptual profile table, or profile -T5 to use the non-saturation-enhancing saturation intent for the saturation profile table). It will distort hues slightly more than "perceptual", but it will expand the source gamut outwards as well as compressing it inwards, whereas perceptual only compresses. This all assumes you are using profile to target a specific source gamut (e.g. sRGB), or using icclink in "non-dumb" mode.
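A toy sketch of the compress-versus-expand distinction described above, using a simplified 1-D "chroma" model. (Real gamut mapping works on 3-D gamut surfaces with nonlinear compression near the boundary; the function names here are illustrative and are not Argyll APIs.)

```python
def perceptual_map(c_src: float, src_max: float, dst_max: float) -> float:
    """Compress-only mapping: if the source gamut fits inside the
    destination, leave chroma alone; only squeeze when it overflows."""
    if src_max <= dst_max:
        return c_src                    # fits: no expansion performed
    return c_src * dst_max / src_max    # uniformly compress to fit


def saturation_map(c_src: float, src_max: float, dst_max: float) -> float:
    """Expand-and-compress mapping: scale the source gamut boundary to
    the destination boundary in both directions."""
    return c_src * dst_max / src_max    # expands when dst_max > src_max


# sRGB-like source (max chroma 60) into a wider print gamut (max 80):
print(perceptual_map(60.0, 60.0, 80.0))  # 60.0 -- perceptual never expands
print(saturation_map(60.0, 60.0, 80.0))  # 80.0 -- saturation fills the gamut
```

Note the asymmetry: both functions behave the same when the source is the larger gamut, but only the saturation-style mapping stretches a small source out to the destination boundary.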

> P.S. Currently waiting for my sRGB perceptual tables to compute, for a
> "high" quality profile. It's very, very, VERY slow! ;^)

Depends on the speed of your machine, and possibly the amount of memory. "high" quality is high resolution, and is usually rather slow. It's the price paid for high precision and flexibility in computing the reverse profile lookups. (I'm guessing it's spending most of its time creating the A2B table?) This is one of the applications where even a 100GHz machine wouldn't be wasted, if such a thing existed.

"medium" quality is often good enough for many uses, and is noticeably
quicker to compute.

Graeme Gill.
