I've been working on an assembly of 454 data that contains a mix of shotgun and paired-end reads. The paired-end reads have been empirically mapped and have a calculated distance of 3000 bp with a standard deviation of 800 bp. All of that information goes into the traceinfo.xml file in the usual fashion, but the distance range that MIRA indicates it's actually using is quite a bit larger than I expect.

Bastien previously posted (//www.freelists.org/post/mira_talk/TF-and-TT-in-MAF-format,3) that he has built in a '"lenience" factor' that allows for 20 or 30% deviation. Was the actual value ever pinned down?

For the experiment I mentioned, the *out.caf file indicates MIRA has given the paired reads an allowable minimum distance of 600 bp and a maximum of 5400 bp. That still gives a mean of 3000 bp, but with an allowed deviation of 2400 bp. I also looked at a previous experiment where the pairs were defined as having a mean of 7200 bp and a standard deviation of 1800 bp. There, the MIRA output indicates it is using a range of 1800 bp to 12600 bp, i.e. a mean of 7200 bp with an allowed deviation of 5400 bp.

In both cases it appears that MIRA is not applying a percentage factor but is actually allowing three standard deviations on either side of the mean. Does that sound correct, Bastien?

I'd like to be able to rein in that value. I could probably do it by specifying an artificially smaller standard deviation in the input traceinfo.xml file, but does anyone know whether that "lenience factor" can be modified directly?

Regards,
Darrell

==========================================
Darrell O. Bayles, M.S., Ph.D.
USDA, ARS, National Animal Disease Center
Infectious Bacterial Diseases Research Unit
1920 Dayton Ave, Bldg 24
P.O. Box 70
Ames, IA 50010
Tel: (515) 337-7165
Fax: (515) 337-7002
==========================================
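P.S. The two ranges reported above are exactly what a mean ± 3·SD rule would produce. A quick check of that inference (the multiplier of 3 is my guess from the numbers, not confirmed MIRA behavior):

```python
def inferred_mira_range(mean_bp, sd_bp, k=3):
    """Insert-size window MIRA appears to accept, assuming
    (unconfirmed) that it allows k standard deviations
    on either side of the template mean."""
    return mean_bp - k * sd_bp, mean_bp + k * sd_bp

# Case 1: mean 3000 bp, SD 800 bp -> MIRA reported 600..5400 bp
print(inferred_mira_range(3000, 800))   # (600, 5400)
# Case 2: mean 7200 bp, SD 1800 bp -> MIRA reported 1800..12600 bp
print(inferred_mira_range(7200, 1800))  # (1800, 12600)
```

Both reported windows match k = 3 exactly, which is why a flat 20-30% lenience factor doesn't seem to fit.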