[dokuwiki] Re: Performance and caching

  • From: "Joe Lapp" <joe.lapp@xxxxxxxxx>
  • To: dokuwiki@xxxxxxxxxxxxx
  • Date: Thu, 08 Sep 2005 08:11:17 -0500 (CDT)

Hi Esther,

> I think we need a way [...]

I really don't think we "need" to keep pages from going stale.  It's a 
nice-to-have, but it truly isn't necessary.  It's a small gain, and the only 
real beneficiary is the person who just created or deleted a page and wants to 
see that the links have changed color.  Others will rarely notice.

We may *need* to provide that user with some feedback indicating the success of 
the creation or deletion by changing the link color on the page from which the 
user traversed to the new or deleted page, but that's easy to do by just 
expiring the cache for the page given in the HTTP "referer" header.
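
Something like this would probably do it.  Rough sketch only; the helper names 
are made up, not real DokuWiki functions:

<?php
// Sketch: expire the cached rendering of the page the user came from, so the
// changed link color shows up there right away.  cache_file_for() and
// url_to_pageid() are hypothetical stand-ins for "path of this page's cached
// XHTML" and "map a URL back to a page id".

function purge_page_cache($id) {
    $cacheFile = cache_file_for($id);      // hypothetical path helper
    if (is_file($cacheFile)) {
        @unlink($cacheFile);               // next view gets re-rendered fresh
    }
}

function expire_referer_cache() {
    if (empty($_SERVER['HTTP_REFERER'])) return;
    $refId = url_to_pageid($_SERVER['HTTP_REFERER']);  // hypothetical mapping
    if ($refId !== null) {
        purge_page_cache($refId);
    }
}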

And if some random user happens to traverse a green link to a deleted page or a 
red link to a newly created page during a staleness window, we can have DokuWiki 
expire the referring page so that the user's discovery is reflected there on the 
next view.  But this may require passing an additional HTTP query parameter in 
red links, so that DokuWiki knows what color the referring link was.  Or we could 
just post a FAQ item making users aware of the staleness period.
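
The extra parameter could be as simple as tagging red-link URLs and checking 
for the tag on the target page.  Again a sketch: 'reflink' and the helpers are 
invented, and I'm just assuming wl() as the usual URL builder and $ID as the 
current page id:

<?php
// Sketch: when rendering a link to a page that doesn't exist yet (a "red"
// link), tag the URL so the target request knows how the referrer drew it.

function render_red_link($targetId) {
    $href = wl($targetId, array('reflink' => 'missing'));  // assumed helper
    return '<a href="' . htmlspecialchars($href) . '">'
         . htmlspecialchars($targetId) . '</a>';
}

// On the target page: the referrer drew this link red, but the page exists
// now, so the referrer's cached copy is stale; expire it using the
// expire_referer_cache() sketch above.
if (isset($_GET['reflink']) && $_GET['reflink'] === 'missing'
        && page_exists($ID)) {            // existence check assumed
    expire_referer_cache();
}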

Power users might even be given the option of appending a "?do=refresh" to 
force a refresh on a page that they're impatient to see updated.  But a power 
user is likely to be logged in, in which case, if we're custom-tailoring pages 
for logged-in users and not drawing them from cache, they're fresh anyway.
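
Handling that would only take a couple of lines in the dispatch code (sketch 
again, reusing the made-up purge helper from above; $ACT and $ID meaning the 
current action and page id):

<?php
// Sketch: honor an explicit "?do=refresh" by throwing away the cached copy
// before rendering, then fall through to a normal page view.

if ($ACT == 'refresh') {
    purge_page_cache($ID);   // made-up helper: delete this page's cached XHTML
    $ACT = 'show';           // render the page fresh as usual
}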

> [...] instead of invalidating the  
> cache for the whole wiki.

Right, we'd never automatically do that.  On a heavily used server, this is 
something that the user might do manually and only at off-peak times.

> We could have a metafile $id.backlink in  
> which all pages are listed which link to or retrieve data from this  
> page (like the include plugin). The parser (and certain plugins)  
> would extract all dependencies on other pages, compare them to the  
> previous version (similar to what the full-text indexer does for words)  
> and write the backlink metafile. Now when a page gets created,  
> changed or deleted the cache of all the pages in the backlink file  
> will be deleted and hence the page is refreshed on the next load.

It's good to think about what it would take to keep pages from going stale, so 
that if it's an easy and sensible task, we can go ahead and do it.  I just want 
to decouple the caching solution from the always-fresh solution.  Let's keep in 
mind that the former doesn't require the latter.
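
For what it's worth, the backlink-metafile approach might look roughly like 
this.  Only a sketch; the file layout and every helper here is invented, not 
actual DokuWiki code:

<?php
// Sketch of the backlink-metafile idea: after a save, update the .backlink
// files of the pages we link to; on create/change/delete, expire the cache of
// every page listed in our own .backlink file.

function update_backlinks($id, $newLinks, $oldLinks) {
    // Pages we now link to (or include): register ourselves with them.
    foreach (array_diff($newLinks, $oldLinks) as $target) {
        append_line(backlink_file($target), $id);     // invented helpers
    }
    // Pages we no longer link to: deregister ourselves.
    foreach (array_diff($oldLinks, $newLinks) as $target) {
        remove_line(backlink_file($target), $id);
    }
}

function expire_backlinking_pages($id) {
    $file = backlink_file($id);
    if (!is_file($file)) return;
    // Everyone listed here links to or includes $id, so their cached copies
    // just went stale.
    foreach (file($file, FILE_IGNORE_NEW_LINES) as $linker) {
        purge_page_cache($linker);
    }
}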

Does anybody know what caching solution MediaWiki is using?

~joe
--
DokuWiki mailing list - more info at
http://wiki.splitbrain.org/wiki:mailinglist
