Re: Performance profiling of makeinfo

From: Patrice Dumas
Subject: Re: Performance profiling of makeinfo
Date: Fri, 26 Dec 2014 21:27:05 +0100
User-agent: Mutt/1.5.20 (2009-12-10)

On Fri, Dec 26, 2014 at 05:33:20PM +0000, Gavin Smith wrote:
> On Fri, Dec 26, 2014 at 9:27 AM, Patrice Dumas <address@hidden> wrote:
> sub parser(;$$)
> # spent 63.3s (3.74+59.5) within Texinfo::Parser::parser which was
> called 4679 times, avg 13.5ms/call: # 4678 times (3.74s+59.5s) by
> Texinfo::Report::gdt at line 401 of
> The gdt function uses a parser, which I believe is used for
> translation of strings.

Indeed, that's the reason.  Oddly, in my tests there wasn't that much use
of gdt, but I can imagine that in some cases gdt could be called a lot.
There is potential for a good speedup here.  In general the parser is
reused, but in gdt there is a comment explaining why that is not a good
idea:
  # Don't reuse the current parser itself, as (tested) the parsing goes 
  # wrong, certainly because the parsed text can affect the parser
  # state.
So a secondary parser is used, which means that everything done
regarding indices/nodes/sections... is forgotten when leaving gdt; in
any case there should be no such Texinfo code in gdt, since it is only
for strings to be translated.  So I think a better course of action
would be to have a way to define a parser setting only a subset of
parser options, defining some keys differently, and, instead of copying,
simply passing references, as they are not supposed to be modified in
the code in gdt.
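A minimal sketch of what I mean, assuming hypothetical key names (the
actual list of reusable parser options would need to be worked out):
only a whitelist of keys is carried over to the secondary parser, and
references are passed as-is since gdt only reads them.

```perl
use strict;
use warnings;

# Keys that are safe to share with the gdt parser (illustrative names,
# not the real Texinfo::Parser option list).
my @reused_keys = ('expanded_formats', 'values', 'documentlanguage');

# Build a configuration hash for a secondary parser from a subset of
# the main parser's options.  References are passed unchanged rather
# than deep-copied, since the code in gdt does not modify them.
sub duplicate_parser_options {
    my ($parser) = @_;
    my %conf;
    for my $key (@reused_keys) {
        $conf{$key} = $parser->{$key} if exists $parser->{$key};
    }
    return \%conf;
}

# Usage would then be something like:
# my $gdt_parser = Texinfo::Parser::parser(duplicate_parser_options($main_parser));
```

This avoids both the cost and the risk of copying the whole parser
state: keys like indices/nodes/sections information are simply never
handed to the secondary parser.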

I could propose something in the next few days.

Also, I suppose we could use dclone or something else in any case.
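For reference, dclone from the core Storable module makes a deep copy,
so mutations through the clone never reach the original structure,
unlike passing references around.  A small sketch with an illustrative
options hash (the key names are made up, not real parser options):

```perl
use strict;
use warnings;
use Storable qw(dclone);

# Hypothetical parser configuration (names illustrative only).
my $options = {
    expanded_formats => ['html'],
    values           => { txicommandconditionals => 1 },
};

# dclone recursively copies the whole structure, so modifying the
# clone leaves the original untouched.
my $copy = dclone($options);
$copy->{values}{txicommandconditionals} = 0;

print $options->{values}{txicommandconditionals}, "\n";
```

The trade-off is that a deep copy costs time and memory proportional to
the size of the structure, which matters if gdt is called thousands of
times, as in the profile above.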

