As I said in my previous message, Ken's result is peculiar. I think he may be referring to processing each word individually and calling the library for each one.

John

On Fri, Oct 25, 2013 at 07:12:34PM -0600, Susan Jolly wrote:
> Thanks for all the helpful replies. Ken, thanks for the timing results. I
> agree with you that it doesn't make sense for liblouis to take 15 minutes
> rather than a few seconds to translate a file with around 100,000 words.
> However, it's hard to guess what the problem is without using a
> sophisticated profiler.
>
> From what I've read, and also in my own experience, the biggest leverage for
> performance comes from using optimal algorithms. Interpreters, compilers,
> operating systems, and hardware are getting faster all the time and, of
> course, computer memory continues to get larger and cheaper. For this
> reason, I think a discussion of algorithms is essential when planning for
> the future.
>
> A very simple example where a choice of algorithms is possible occurs in
> braille translators that look up words in a provided list of translations
> and, if a word isn't found, translate it using a translation table. If
> using the list is faster, that suggests that a modification that adds
> each newly translated word to the list would be faster than retranslating
> the word each time it is encountered.
>
> Susan
>
> For a description of the software, to download it and links to
> project pages go to http://www.abilitiessoft.com

--
John J. Boyer; President, Chief Software Developer
Abilitiessoft, Inc. http://www.abilitiessoft.com
Madison, Wisconsin USA
Developing software for people with disabilities
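
Susan's suggested modification amounts to memoization: look each word up in a cache, and only fall back to the expensive table-driven translation on a miss, adding the result to the cache afterwards. A minimal sketch in Python, where `translate_word` is a hypothetical stand-in for a per-word call into a translation library (not liblouis's actual API):

```python
# Sketch of the word-list optimization: cache each word's translation
# the first time it is computed so repeated words skip the expensive
# table-driven translation. translate_word() is a hypothetical
# placeholder for a real per-word translation call.

def translate_word(word):
    # Placeholder for an expensive table-driven translation.
    return word.upper()

# Starts as the provided word list; grows with newly translated words.
translation_cache = {}

def translate_cached(word):
    translation = translation_cache.get(word)
    if translation is None:
        translation = translate_word(word)
        translation_cache[word] = translation  # remember for next time
    return translation

def translate_text(words):
    return [translate_cached(w) for w in words]
```

Since ordinary text repeats common words heavily, a 100,000-word file would trigger far fewer full translations than per-word calls; whether that explains Ken's 15-minute timing would still need profiling to confirm.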