Hello list,
I would like to use a debug hook with the TrueType bytecode interpreter to calculate the maxp table's `maxStackElements` field, which the spec defines as the "maximum stack depth", including "Font and CVT Programs, as well as the instructions for each glyph".
I'm trying to emulate what MS Visual TrueType (VTT) does when one runs the command "Recalc maxp values" from its graphical interface.
I've looked at how freetype2-demos' ttdebug and FontForge's TT debugger work, and below is what I've got so far.
Please note that my C is quite rudimentary -- I plan to port this code to Python at some point -- but I think it gets the point across:
```
/* requires FreeType's internal TrueType headers (e.g. `ttinterp.h' */
/* from `src/truetype'), as freetype2-demos' ttdebug does           */
#define CUR  (*exc)

struct debugger_context
{
  FT_Long  maxStackSize;
};

static struct debugger_context*  global_debugger_context;

/* debug hook: single-step the interpreter and record the deepest */
/* stack position seen after each instruction                     */
static FT_Error
test_debug_hook( TT_ExecContext  exc )
{
  struct debugger_context*  dc    = global_debugger_context;
  FT_Error                  error = 0;

  CUR.instruction_trap = 1;  /* make `TT_RunIns' return after every instruction */

  while ( CUR.IP < CUR.codeSize )
  {
    error = TT_RunIns( exc );
    if ( error != 0 )
      break;

    if ( CUR.top > dc->maxStackSize )
      dc->maxStackSize = CUR.top;
  }

  return error;
}
```
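For reference, this is roughly how I install and drive the hook (a minimal sketch of my setup: the font path, ppem, and load flags below are just placeholders, and error handling is omitted). The hook is installed with `FT_Set_Debug_Hook` before opening the face, as ttdebug does, and then every glyph is loaded with hinting enabled so the hook gets to run the fpgm, prep, and glyph programs:
```
#include <stdio.h>

#include <ft2build.h>
#include FT_FREETYPE_H

int
main( void )
{
  struct debugger_context  dc = { 0 };

  FT_Library  library;
  FT_Face     face;
  FT_UInt     gid;

  global_debugger_context = &dc;

  FT_Init_FreeType( &library );

  /* install the hook before opening the face, as ttdebug does */
  FT_Set_Debug_Hook( library,
                     FT_DEBUG_HOOK_TRUETYPE,
                     (FT_DebugHook_Func)test_debug_hook );

  FT_New_Face( library, "font.ttf", 0, &face );   /* placeholder path */
  FT_Set_Pixel_Sizes( face, 0, 12 );              /* arbitrary ppem   */

  /* run the bytecode for every glyph so the hook sees all programs */
  for ( gid = 0; gid < (FT_UInt)face->num_glyphs; gid++ )
    FT_Load_Glyph( face, gid, FT_LOAD_DEFAULT );

  printf( "maximum stack depth: %ld\n", dc.maxStackSize );

  FT_Done_Face( face );
  FT_Done_FreeType( library );

  return 0;
}
```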
This seems to work, in the sense that the computed value is big enough for both the FreeType and MS rasterizers to render the font. If I set `maxStackElements` to anything lower than that value, the interpreter presumably overflows its allocated stack and rejects the instructions.
I noticed, however, that the value computed by VTT always seems to be a bit greater than the one I get from the code above, usually some 80-90 stack elements greater (the exact difference varies from font to font).
I don't know why that is, and I wonder whether someone on this list could shed some light on it.
Thank you for your support.
All best,
--
Cosimo Lupo