Hi everyone!

It looks like Axel has fixed Trac's speed issues. The site is fast and responsive again. As it turns out, search engines caused all the traffic: they tried to index every commit and the diffs between all pairs of revisions. As you can imagine, the number of combinations quickly adds up to many hundreds of millions of pages to index. So, the solution was to add a simple robots.txt file to prevent search engines from indexing the _whole_ Trac site.

Jorge, are you satisfied now? :P

Bye,
Waldemar Kornewald
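
P.S. In case anyone wants to do the same for their own Trac instance, here is a minimal sketch of what such a robots.txt could look like. The paths below are assumptions based on Trac's usual URL layout (changesets, diffs, annotated source), not the actual file that was deployed:

```
# Hypothetical robots.txt for a Trac site; adjust paths to your installation.
User-agent: *
# Changesets and revision diffs explode combinatorially, so keep crawlers out.
Disallow: /changeset
Disallow: /browser
Disallow: /log
Disallow: /timeline
```

The file just needs to sit at the web root (e.g. example.com/robots.txt) for well-behaved crawlers to pick it up.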