Hey all,

Just a heads up, in case anyone else is working on the same thing, in case we want this in the next release, and in case anyone thinks I'm crazy. I plan to improve RSS feed performance as follows:

(1) I will cache the RSS entries previously encountered, so that the server generates the entry for a particular changelog line only once. I'll assume that the timestamp and the page ID together uniquely identify a changelog entry. If there are changes, I'll regenerate the feed from the cached raw data.

(2) If the changelog timestamp has not changed, I'll simply re-output the most recently generated feed from the cached XML.

(3) I'll run independent virtual feeds, each uniquely identified by the HTTP query parameters passed to the feed, for purposes of at least XML caching. I'll try to share cached data across feeds, but I may only be able to do this for feeds whose query parameters select the same entries (not sure yet).

Under this approach, nothing is gained when a large number of changes accumulates between feed retrievals. If, however, the feed is retrieved on a regular basis, its performance can be kept reasonable; the required retrieval frequency depends on the rate at which the changelog grows. Unfortunately, unless I find a way to share raw data among all virtual feeds, you'd have to retrieve one virtual feed for each combination of entry-filtering parameters whose performance you cared about. Perhaps one way to guarantee good performance on average sites (for a given set of entry filters) is simply to register the feed with an RSS aggregator for daily or hourly refreshes.

Stop me if I'm heading the wrong way. You can help by giving me pointers on using DokuWiki's page locking mechanisms, which is the one big unknown in my mind.

~joe

--
DokuWiki mailing list - more info at http://wiki.splitbrain.org/wiki:mailinglist
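To make the plan concrete, here is a rough sketch of the two-level cache I have in mind. All names are hypothetical (DokuWiki itself is PHP; Python is used here just for brevity), and the rendering and filtering details are stand-ins:

```python
class FeedCache:
    """Sketch of the proposed two-level RSS cache (hypothetical design).

    Level 1: per-entry cache keyed by (timestamp, page_id), so each
    changelog line is rendered at most once.
    Level 2: per-virtual-feed XML cache keyed by the feed's HTTP query
    parameters, re-output verbatim while the changelog is unchanged.
    """

    def __init__(self):
        self.entry_cache = {}  # (timestamp, page_id) -> rendered entry
        self.xml_cache = {}    # frozenset(params) -> (changelog_ts, xml)

    def render_entry(self, timestamp, page_id):
        # Expensive entry generation happens once per changelog line;
        # (timestamp, page_id) is assumed to identify an entry uniquely.
        key = (timestamp, page_id)
        if key not in self.entry_cache:
            self.entry_cache[key] = (
                f"<item><guid>{page_id}@{timestamp}</guid></item>"
            )
        return self.entry_cache[key]

    def get_feed(self, params, changelog_ts, changelog_lines):
        # params: dict of HTTP query parameters naming the virtual feed.
        feed_key = frozenset(params.items())
        cached = self.xml_cache.get(feed_key)
        if cached is not None and cached[0] == changelog_ts:
            # Changelog unchanged: re-output the previously cached XML.
            return cached[1]
        # Changelog changed: rebuild the feed from cached raw entries.
        items = [self.render_entry(ts, pid) for ts, pid in changelog_lines]
        xml = "<rss><channel>" + "".join(items) + "</channel></rss>"
        self.xml_cache[feed_key] = (changelog_ts, xml)
        return xml
```

Note that the entry cache is shared across virtual feeds, while each combination of query parameters keeps its own XML cache, which matches points (1) through (3) above.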