The lack of underexposure tolerance is to be expected from
the method used to determine ANSI/ISO speeds. This method is
distantly related to the original Kodak speed method. The idea
was to find the _minimum_ exposure needed for "an excellent
print," meaning one with adequate shadow detail. What was found
in the original research was that film had maybe a one-stop
underexposure margin but a margin of many stops for overexposure.
Exposing several stops beyond what the speed measurement
indicated had little or no effect on print quality. The
minimum exposure was justified because thin negatives have the
best sharpness and minimum grain. This matters far less for
modern film than it did for the 1930s stock the tests were made
on.

When Kodak speeds were adopted by the ASA in the mid-1940s, a
one-stop fudge factor was added, i.e., the film was actually twice
as fast as the published speed. That factor was dropped in the late
1950s, when the ASA changed from the original Kodak speed method
to a modification of the DIN method, which was much easier to
measure and yielded identical speeds once a fixed correction
factor was applied. This is still the method currently used for
B&W still negative film. Note that motion picture negative and
color films use a different method. Optimum exposure for B&W
still film is probably about twice the box speed, i.e., rating the
film at half its rated speed (an ISO 400 film shot at EI 200, for
example).
On 6/22/2018 3:03 PM, Tim Daneliuk wrote:
On 06/22/2018 04:38 PM, Richard Knoppow wrote:
Thoughts:
1) One thing I discovered in developing the correction tables for the
temperature-controlled timer I was working on is that - at least for
conventional film/dev combos - the correction for time and temp is
pretty consistent across them in percentage terms. When I compared
formula-driven corrections against table data published by Kodak and Ilford,
the formula was pretty close, at least between 60F and 80F, which is what
the timer currently targets. "Close" means less than 10% or so variation
from published values - essentially negligible (unless you are a BTZS
fiddler who likes doing sensitometric testing more than taking pictures :)
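
FWIW, here is a rough sketch of the kind of formula-driven correction
I mean. The ~4.5%-per-degree-F constant is an assumed ballpark for
illustration only - it is not the exact factor my timer uses and not a
published Kodak/Ilford figure:

    # Rough sketch of a formula-driven time/temperature correction,
    # assuming development time changes by a roughly constant
    # percentage per degree F. PCT_PER_DEGREE is an assumed ballpark,
    # not a published Kodak/Ilford figure.

    BASE_TEMP_F = 68.0       # standard reference temperature
    PCT_PER_DEGREE = 0.045   # assumed ~4.5% change per degree F

    def corrected_time(base_time_min, actual_temp_f):
        # Warmer developer works faster, so the time shrinks;
        # cooler developer works slower, so it grows.
        factor = (1.0 - PCT_PER_DEGREE) ** (actual_temp_f - BASE_TEMP_F)
        return base_time_min * factor

    # A 9:00 time at 68F comes out around 6.5 min at 75F and
    # around 13 min at 60F with this constant.
    for temp in (60, 68, 75, 80):
        print(f"{temp}F: {corrected_time(9.0, temp):.1f} min")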
2) In my own experience, treating conventional films of a given ASA
as being the "same" for purposes of figuring out the nominal development
time is pretty close. For example, I didn't have data for Arista 100
in PMK, so I used my Agfapan 100 in PMK numbers to get started, because both
are nominally ASA 100 films (and both test at a "real" ASA of about 50 on
a densitometer when processed in "normal" developers like HC-110). This was
essentially right on, or close enough anyway.
3) So, for everything except tabular-grain films, if I get an alien film for
which I have no data, I just use another film of similar ASA to get going,
and the same time/temp curve for everything.
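
If you wanted to mechanize that fallback, it might look something like
the sketch below. The table entries are placeholders for illustration,
not tested values:

    # Sketch of the "borrow a similar-ASA film's time" fallback.
    # Table entries are illustrative placeholders, not tested values.

    KNOWN_TIMES = {
        # (film, developer): (rated_asa, minutes_at_68f)
        ("agfapan-100", "pmk"): (100, 11.0),
        ("tri-x", "hc-110b"): (400, 7.5),
    }

    def starting_time(film, developer, rated_asa):
        # Use a real entry if we have one...
        if (film, developer) in KNOWN_TIMES:
            return KNOWN_TIMES[(film, developer)][1]
        # ...otherwise borrow from the closest-ASA film in the
        # same developer as a starting point.
        candidates = [
            (abs(asa - rated_asa), minutes)
            for (_film, dev), (asa, minutes) in KNOWN_TIMES.items()
            if dev == developer
        ]
        if not candidates:
            raise LookupError("no data at all for " + developer)
        return min(candidates)[1]

    # An unknown ASA 100 film in PMK borrows the Agfapan time:
    print(starting_time("arista-100", "pmk", 100))  # -> 11.0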
I used to be meticulous about testing every new film/developer combo I
tried with the densitometer. But it just doesn't seem to be all that
necessary unless I get very strange or unexpected results. Modern films
have huge latitude. The usual sin committed against them is underexposure. But
if you hold shadow detail and don't let the highlights blow out, you can
easily correct minor development variability in the printing process. At this
point, I pretty much just use these rules of thumb for everything, and it
works remarkably well:
a) Expose film at half its rated ASA
b) Find the nominal 68F time for that film, or for another film of similar rated ASA.
c) Underdevelop 20%
d) For each +/-N adjustment, increase/decrease development time 20%.
Perfect? No, probably not. But I get consistent, easy-to-print negs
this way.
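
For what it's worth, here are those rules as a throwaway script.
Nothing in it is calibrated to any particular film; it just encodes
(a)-(d) as written:

    # The rules of thumb above as a throwaway calculator.
    # Nothing film-specific here; it just encodes (a)-(d).

    def exposure_index(box_asa):
        # (a) expose at half the rated ASA
        return box_asa / 2

    def dev_time(nominal_68f_min, n_adjustment=0):
        # (c) pull 20% off the published 68F time...
        t = nominal_68f_min * 0.80
        # ...then (d) add/subtract 20% per +/-N step
        return t * (1.0 + 0.20 * n_adjustment)

    # An ASA 400 film with a published 8:00 time at 68F, shot at
    # EI 200 and given N-1 development:
    print(exposure_index(400))   # -> 200.0
    print(dev_time(8.0, -1))     # -> 5.12

The only real lookup is step (b); everything else is arithmetic.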