
Re: lynx-dev dev.15 patch 5 - mostly TRST


From: Leonid Pauzner
Subject: Re: lynx-dev dev.15 patch 5 - mostly TRST
Date: Sun, 28 Nov 1999 19:12:36 +0300 (MSK)

28-Nov-99 08:39 Klaus Weide wrote:
> This would be dev.15 patch 4 except that Leonid (with his much appreciated
> patch) stole the number. :)  Should be applied after
:)

> * Limit span values accepted for TRST with TRST_MAXCOLSPAN and
>   TRST_MAXROWSPAN, which can be changed in userdefs.h.  Without imposing
>   a limit, attempts to trick lynx into allocating huge blocks of memory
>   (which might cause thrashing without apparent reason) with something
>   like ROWSPAN=10000000 are just too easy.

At least we already have a LINESIZE parameter, so TRST_MAXCOLSPAN has a
natural upper bound.
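For illustration, here is roughly how such a limit could be enforced (a
minimal sketch with made-up names, assuming only the TRST_MAXROWSPAN /
TRST_MAXCOLSPAN defines from userdefs.h; this is not the actual TRST code):

    /* Clamp a span attribute so that something like ROWSPAN=10000000
     * cannot force a huge table allocation.  Illustrative only. */
    static int clamp_span(int span, int maxspan)
    {
        if (span < 1)
            return 1;               /* treat nonsense values as 1 */
        return (span > maxspan) ? maxspan : span;
    }

    /* e.g. when handling a TD/TH start tag: */
    rowspan = clamp_span(rowspan, TRST_MAXROWSPAN);
    colspan = clamp_span(colspan, TRST_MAXCOLSPAN);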

> * Moved definition of SAVE_TIME_NOT_SPACE to userdefs.h.  You may
>   want to undefine it for a platform where running out of memory is a
>   frequent problem (DOS?), although the effect won't be very pronounced.
Never: DJGPP uses a DPMI server which provides virtual memory up to 256MB,
as long as disk space is available; of course this may affect speed...

I have not noticed any visible difference with/without SAVE_TIME_NOT_SPACE
on my slow 386 machine under DOS.

>   Used in TRST code to affect size of some allocations (also used as before
>   for HTSprintf0/HTSprintf).
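To make the trade-off concrete, a minimal sketch of the idea behind
SAVE_TIME_NOT_SPACE (names and the chunk size are mine, not the actual
lynx code): round allocation sizes up to a chunk boundary so a buffer
that grows repeatedly needs fewer realloc calls, at the cost of some
slack memory.

    #include <stdlib.h>

    #ifdef SAVE_TIME_NOT_SPACE
    #define ALLOC_CHUNK 128
    #define ROUND_UP(n) ((((n) + ALLOC_CHUNK - 1) / ALLOC_CHUNK) * ALLOC_CHUNK)
    #else
    #define ROUND_UP(n) (n)     /* exact sizes: least memory, most calls */
    #endif

    /* Grow a buffer to hold at least `needed` bytes; with
     * SAVE_TIME_NOT_SPACE defined, growth by one byte at a time
     * only hits realloc once per 128-byte chunk. */
    char *grow(char *buf, size_t needed)
    {
        return realloc(buf, ROUND_UP(needed));
    }

With many small incremental appends (as in HTSprintf-style string
building), the chunked version saves allocator calls; with memory tight,
the exact-size version wastes no slack.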


> Index: 2.23/userdefs.h
> --- 2.23/userdefs.h Thu, 25 Nov 1999 11:29:56 -0600
> +++ 2.23(w)/userdefs.h Sun, 28 Nov 1999 06:36:36 -0600
> @@ -1337,6 +1337,9 @@
>  #endif

>  #define MAXCHARSETS 60               /* max character sets supported */
> +#define TRST_MAXROWSPAN 10000        /* max rowspan accepted by TRST code */
> +#define TRST_MAXCOLSPAN 1000 /* max colspan and COL/COLGROUP span accepted */
> +#define SAVE_TIME_NOT_SPACE  /* minimize number of some malloc calls */

>  /* Win32 may support more, but old win16 helper apps may not. */
>  #if defined(__DJGPP__) || defined(_WINDOWS)
