> do the averaging in-memory before writing the final averaged values to
> disk? Unless one is interested in determining the behavior of the
> instrument, I don't see the advantage of hanging on to the intermediate
> values.

I avoided this approach only because sometimes one gets outliers that
should NOT be averaged (rather, one should rescan the line with the
chartread -r command and THEN average it with the other readings once
corrected). Currently that means separate files, then running verify to
confirm there are no outliers (a very important step), then averaging.
If that issue is addressed, then I agree in-memory averaging would be
better.

Elena's idea seems good. Chartread would need a user-settable dE
tolerance; at the end of the N scans the lines would be checked and
rescans requested until every line meets the dE tolerance (or the user
can give up at any time). This would greatly simplify the current
process of running verify, manually identifying outliers, and manually
rescanning when outliers are found.
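
Something like the following Python sketch illustrates the proposed
loop. It is not chartread's actual code: the names (read_strip,
outlier_scans, DE_TOLERANCE) are invented for illustration, and it
uses plain CIE76 Euclidean distance for dE, where the real tool might
prefer dE94 or dE2000:

    import math
    from statistics import median

    DE_TOLERANCE = 1.0   # user-settable dE threshold
    N_SCANS = 3          # number of scans to average

    def median_lab(readings):
        # component-wise median of a list of Lab triples, used as a
        # robust reference so an outlier cannot drag the target
        return tuple(median(r[i] for r in readings) for i in range(3))

    def mean_lab(readings):
        # component-wise mean of a list of Lab triples
        n = len(readings)
        return tuple(sum(r[i] for r in readings) / n for i in range(3))

    def outlier_scans(scans, tol):
        # flag any scan whose worst patch is more than tol dE away
        # from the per-patch median over all scans of that line
        n_patches = len(scans[0])
        refs = [median_lab([s[p] for s in scans])
                for p in range(n_patches)]
        return [i for i, scan in enumerate(scans)
                if any(math.dist(scan[p], refs[p]) > tol
                       for p in range(n_patches))]

    def read_line_averaged(read_strip, tol=DE_TOLERANCE, n=N_SCANS):
        # scan the strip n times, re-request any scan that fails the
        # tolerance check, and only average once all scans agree;
        # read_strip() stands in for one physical strip read (the
        # equivalent of rescanning a line with chartread -r)
        scans = [read_strip() for _ in range(n)]
        while True:
            bad = outlier_scans(scans, tol)
            if not bad:
                break
            for i in bad:
                print("scan %d out of tolerance (%.1f dE), rescan"
                      % (i, tol))
                scans[i] = read_strip()  # a real UI would also offer
                                         # the option to give up here
        return [mean_lab([s[p] for s in scans])
                for p in range(len(scans[0]))]

The sketch checks each scan against the per-patch median rather than
the mean, so a bad reading cannot pull the reference toward itself;
otherwise it follows the flow described above: read N times, check at
the end, and keep requesting rescans until every line is within the
dE tolerance.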