[dokuwiki] Speed

  • From: Ian Laurenson <hillview@xxxxxxxxxxxxxxx>
  • To: dokuwiki@xxxxxxxxxxxxx
  • Date: Tue, 03 May 2005 11:32:54 +1200

I have set up a DokuWiki site for writing macros in OpenOffice.org.
The host has just disabled the site, because when Google's spider
crawls it the load effectively amounts to a denial of service.

I had created a robots.txt file to exclude 6632 files (18.6 MB),
which left about 700 files (4.8 MB) open to crawling.

I have just changed the robots.txt file to exclude the entire site:

User-agent: *
Disallow: /
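A possible middle ground might be to keep the path-based exclusions but
also ask crawlers to slow down. The Crawl-delay directive is a
non-standard extension that some spiders honour (Googlebot notably
ignores it), and the paths below are only placeholders for whatever
directories actually hold the bulky files:

User-agent: *
# hypothetical paths - substitute the real directories to exclude
Disallow: /attic/
Disallow: /export/
# wait N seconds between requests (non-standard; ignored by Googlebot)
Crawl-delay: 10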

I would prefer it if the ~700 files could be indexed by Google without
bringing the host server to its knees. Any suggestions?

Thanks, Ian Laurenson

-- 
DokuWiki mailing list - more info at
http://wiki.splitbrain.org/wiki:mailinglist

Other related posts: