Hi all,

I don't remember doing anything weird with the lexer. Anyway, the lexer is always called via the function p_get_instructions($text). In Jaume's case, $text is always filled by rawWiki($id). But I suspect the stinky part is that, for each link found in the main document, p_get_instructions(rawWiki($id)) is called again with the link's id.
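The call chain described above can be sketched in miniature. This is an illustrative Python toy, not DokuWiki's actual PHP (p_get_instructions and rawWiki are the real function names, but the parsing logic here is invented), showing the suspected pattern of re-invoking the full parser for every link:

```python
# Toy stand-in for the suspected pattern: the renderer calls the full
# parser again (p_get_instructions(rawWiki($id)) in DokuWiki terms) for
# every link it meets, so each linked page triggers another parse.

def parse(text, pages):
    """Return a list of (type, ...) instruction tuples for `text`."""
    instructions = []
    for token in text.split():
        if token.startswith("[[") and token.endswith("]]"):
            page_id = token[2:-2]
            # Recurse into the linked page, analogous to calling
            # p_get_instructions() again on its raw wiki text.
            instructions.append(("link", page_id,
                                 parse(pages.get(page_id, ""), pages)))
        else:
            instructions.append(("text", token))
    return instructions

pages = {"start": "intro [[child]] outro", "child": "leaf text"}
result = parse(pages["start"], pages)
```

Each link triggers a fresh parse of the linked page's raw text, so the total parsing work grows with the number of linked pages, not just the size of the page being rendered.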
I believe that I didn't clean up the instructions result set properly. A few months ago I asked about that: http://www.freelists.org/archives/dokuwiki/09-2006/msg00299.html

Harry, what do you think about that?

Regards,
Danjer

Harry Fuecks wrote:
From a brief scan of the code (found at http://danjer.doudouke.org/tech/dokutexit), in particular class.texitrender.php, it invokes the parser directly and _may_ be passing something to the lexer which depends on the input document. I'd discuss this with the author...

On 5/22/07, Jaume Obrador <obrador@xxxxxxxxxxxx> wrote:
The only plugin I have installed, and the one in use when this error occurs, is dokutexit, which parses DokuWiki syntax into LaTeX to generate a PDF file. Thanks.

Jaume Obrador

On Mon, 21 May 2007 at 23:47 +0200, Harry Fuecks wrote:
> On 5/21/07, Jaume Obrador <obrador@xxxxxxxxxxxx> wrote:
> > Hi, I'm trying to use DokuWiki together with the dokutexit plugin to
> > generate a PDF file from some pages. It works great with a few pages,
> > but the problem comes when I try to generate a PDF from a larger
> > number of pages (209K of text in total). I get the following error
> > 22 times:
> >
> > Warning: preg_match() [function.preg-match]: Compilation failed:
> > regular expression too large at offset 0
> > in /var/www/lledoner/dokuwiki/inc/parser/lexer.php on line 115
> >
> > I increased the default PHP memory_limit from 8M to 32M, with no
> > improvement.
>
> That's a limitation of PCRE, not PHP - see http://www.pcre.org/pcre.txt,
> section "LIMITATIONS": "The maximum length of a compiled pattern is
> 65539 (sic) bytes if PCRE is compiled with the default internal linkage
> size of 2."
>
> It goes on to say you can compile PCRE differently to get round this,
> but that will probably get interesting with PHP, where PCRE is part of
> the core distribution.
>
> Without knowing more, I'd say this problem _isn't_ directly caused by
> trying to generate a PDF from many pages, because the size of the
> parser regex should not depend on the size of the document(s) you are
> parsing.
> To have this error, possible causes might be:
>
> - a very large smileys or acronyms file - last time I checked, these
>   get turned into a single regular expression, so a big input file
>   could lead to a regex > 65539 bytes. Normally, though, if the problem
>   were here, you'd see it on every page, not just the PDF version
>
> - a plugin which _IS_ using the document to build further regular
>   expressions in some way: list your plugins...
>
> Hope that helps

--
DokuWiki mailing list - more info at http://wiki.splitbrain.org/wiki:mailinglist
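Harry's first suspected cause - the smileys/acronyms file being compiled into one big alternation regex - is easy to illustrate. The following is a hedged sketch in Python rather than DokuWiki's PHP, with invented entry names; note that the 65539-byte PCRE limit applies to the *compiled* pattern, so source length is only a rough proxy, but the linear growth is the point:

```python
# Rough illustration (Python, not DokuWiki's code; entry names invented)
# of how compiling a smileys/acronyms config into ONE alternation regex
# makes the pattern grow linearly with the size of the config file.

import re

def build_alternation(entries):
    """Escape each entry and join them into a single (a|b|...) pattern."""
    return "(" + "|".join(re.escape(e) for e in entries) + ")"

# A normal-sized config stays far below PCRE's default ceiling of
# 65539 bytes for a compiled pattern.
small = build_alternation([":-)", ":-(", "8-)"])
assert len(small) < 65539

# A pathological config with thousands of entries pushes even the
# pattern *source* past that ceiling, so PCRE compilation must fail.
huge = build_alternation("ACRONYM_%05d" % i for i in range(6000))
assert len(huge) > 65539
```

This also explains the diagnostic Harry gives: an oversized smileys/acronyms regex would trigger the "regular expression too large" warning on every page, whereas a plugin that builds regexes from document content would only fail on large (or many combined) documents.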