[dokuwiki] Re: rate limiting

  • From: Nate Kohl <nate@xxxxxxxxxxxxxxxx>
  • To: dokuwiki@xxxxxxxxxxxxx
  • Date: Tue, 29 Dec 2009 10:13:53 -0500

>> Does anybody know if there is an option in DokuWiki to perform rate
>> limiting based on IP addresses?  For example, can DokuWiki start
>> adding incrementally larger pauses for a given IP if it requests more
>> than X pages in Y seconds?
>>
>> Whenever people (not Google, just random users) spider my site, CPU
>> usage gets rather high, and it's happening frequently enough that my
>> hosting provider has noticed.  We might be able to do some rate
>> limiting at the firewall, but I thought it would be worth finding out
>> if DokuWiki already has that functionality built in.
>>
>
> A better option may be to look at your cache settings.  DokuWiki can serve
> cached pages with very little effort.  High resource usage comes when it has
> to re-render or completely regenerate a page.  If your wiki (or a high
> proportion of pages in your wiki) sees little activity, each access will
> require at least a re-render.  The caching system should be good enough for
> you to set the config value for cachetime to one month (60*60*24*30) or even
> higher.
>
> If the spiders pay attention to your robots.txt file, you could ease things
> by setting a higher 'Crawl-delay' value.

Thanks Chris -- that's a good idea.  I'll up page caching to a month
and see what effect it has on CPU usage.
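
For the archives, that cache change amounts to one line in conf/local.php
(a sketch, assuming a default install layout; 60*60*24*30 seconds is the
one-month value Chris mentioned):

  <?php
  // conf/local.php -- local overrides of DokuWiki settings.
  // Keep cached page renderings valid for ~30 days, so repeat hits
  // (including spiders) are served from cache instead of re-rendered.
  $conf['cachetime'] = 60*60*24*30;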
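
For the better-behaved spiders, I'll also try a Crawl-delay in the
robots.txt at the wiki root. Not every crawler honors it, but it's cheap
to try; the 30-second delay below is just a starting point, not a
recommendation:

  User-agent: *
  Crawl-delay: 30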
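
If that still isn't enough, we may fall back to rate limiting at the
firewall after all. A rough sketch using iptables' 'recent' match -- the
60-second window and hit count are placeholders, and note this counts new
connections rather than individual page requests:

  # Track new HTTP connections per source IP, and drop any IP that
  # opens more than 15 of them within 60 seconds.
  iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
    -m recent --name wikilimit --set
  iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
    -m recent --name wikilimit --update --seconds 60 --hitcount 15 \
    -j DROP
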
   nate-
--
DokuWiki mailing list - more info at
http://www.dokuwiki.org/mailinglist
