On Fri, 27 Oct 2006 04:12:18 +0100 Chris Smith <chris@xxxxxxxxxxxxx> wrote:

> liushk wrote:
> > hi, all.
> >
> > i wanna use dokuwiki as an intranet website.
> > the site will have about 100 users, about 1,000 hits per day, and
> > about 100,000 pages.
> >
> > is dokuwiki suitable?
>
> Andi, Ben, *,
>
> How would the current caching structure stand up to this?
>
> With 100,000 pages, I reckon there would be in excess of 12k files
> per cache directory. Do modern file systems handle that many files
> in one directory without performance penalties, or is there a need to
> configure DW to use the first "n" characters of the cache name to
> create a directory tree for the cache files?

I guess it depends on the file system (I have heard ReiserFS is best for many small files). Modifying the cachename function could help with so many pages, though. There might be other problems as well, but until someone with that many pages actually tries it, we will never know ;-)

Having a lot of RAM in the server would be a good idea in any case, to allow for a big I/O cache.

Andi
--
http://www.splitbrain.org

--
DokuWiki mailing list - more info at
http://wiki.splitbrain.org/wiki:mailinglist
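P.S. For anyone curious, the directory-tree idea Chris describes could be sketched roughly like this. This is Python with made-up names (`sharded_cache_path` and its parameters are hypothetical, not DokuWiki's real PHP `getCacheName()` function); it just shows how the first characters of a hashed cache name can fan the files out over subdirectories so no single directory has to hold all 100,000+ entries:

```python
import hashlib
import os


def sharded_cache_path(cache_root, page_id, levels=1, chars=2):
    """Map a page id to a cache file path, using the first characters
    of its MD5 hash as subdirectory names so cache files are spread
    across many small directories instead of one huge one."""
    name = hashlib.md5(page_id.encode("utf-8")).hexdigest()
    # e.g. levels=1, chars=2 turns "ab3f..." into subdirectory "ab"
    parts = [name[i * chars:(i + 1) * chars] for i in range(levels)]
    directory = os.path.join(cache_root, *parts)
    os.makedirs(directory, exist_ok=True)
    return os.path.join(directory, name + ".cache")
```

With 16^2 = 256 first-level directories, 100,000 cache files average out to roughly 400 files per directory, which any file system handles comfortably.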