Re: [Gcl-devel] make-databases

From: Camm Maguire
Subject: Re: [Gcl-devel] make-databases
Date: 31 Oct 2003 10:51:25 -0500
User-agent: Gnus/5.09 (Gnus v5.9.0) Emacs/21.2


root <address@hidden> writes:

> Camm,
> First, move to where the .NRLIB files are.
> The databases are built using the *.NRLIB/info files.
>   cd int/algebra
> Then start interpsys and give it the lisp command:
>   )lisp (make-databases "" nil) | (yourpath)/obj/bin/interpsys
> I'm still pondering your "split" request for database file loads.

Thanks, Tim.  Please feel free to say it is too difficult if that is
the case.  It is likely the shortest route to Axiom on mips(el),
alpha, ia64, and hppa, but in principle we should build a bfd backend
for those targets at some point, and this may provide the incentive :-).
The latter is a large task, and I likely will not have time for it
until 2004, but it should be doable, though ugly.  The following page
describes the state of bfd relocation pretty well:


Apparently these newer target machines are using the 'new linker'
mechanism which bypasses the bfd_get_relocated_section_contents
strategy we're using entirely.  Instead, they provide a 'backend'
specific 'relocate_section' function, an example call of which can be
found in the elflink.h file in the binutils/bfd subdir of the gcl
source tree.  While in principle such a call might work across the
board, it appears to use quite a few auxiliary structures which
require building before the call.  Energetic volunteers appreciated!
Perhaps if anyone feels motivated, they could fork the sfaslbfd.c file
into sfaslbfd_new.c, and rewrite the section relocation calls using
bfd's 'new' linker strategy.  (While I'm sure he is too busy, we do
have such a bfd-knowledgeable person on the gcl team, right, Aurelian?)

Take care,

> In more detail: 
> The make-databases function walks across the NRLIB/info files 
> extracting information for documentation (e.g. the browse.daase)
> or other uses. So, in the case of browse.daase, it extracts
> documentation strings, compresses them (see src/interp/daase.lisp),
> and writes them out into a random access file format. 
> The random access file format has 3 parts: 
> an initial pair, 
> a sequence of s-expressions, 
> a list of lists. 
> The list of lists is the index. To look something up in one of these
> files you read the first number (N), seek N bytes into the file, and
> do a (read). Read will return the list of lists.  For each domain
> there is a list. 
> Each list consists of indexes into the file (any non-negative number)
> or immediate data (a symbol, a string, or a negative number). If the
> list item is a non-negative number (M) you start from the top of the
> file, seek M bytes into the file, and call (uncompress (read)).
> I'm sure there are more details I missed, but src/interp/daase.lisp
> has the actual code.
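The lookup scheme described above can be sketched in Python as a toy model
(illustration only: the real implementation is the Lisp in
src/interp/daase.lisp, this sketch invents its own 8-digit header layout,
and it treats uncompress as a no-op stand-in):

```python
import re

# Toy model of the daase random-access layout: a leading number N, a
# run of s-expressions, then an index (a list of lists) at byte offset
# N.  Domain names and offsets below are made up for the demo.

TOKEN = re.compile(r'\s*(\(|\)|"[^"]*"|[^\s()]+)')

def read_form(text, pos=0):
    """Read one s-expression at pos; return (value, end_pos).
    Lists -> Python lists, "..." -> str, integers -> int, else a symbol
    kept as a bare string."""
    m = TOKEN.match(text, pos)
    tok, pos = m.group(1), m.end()
    if tok == '(':
        items = []
        while True:
            peek = TOKEN.match(text, pos)
            if peek.group(1) == ')':
                return items, peek.end()
            item, pos = read_form(text, pos)
            items.append(item)
    if tok.startswith('"'):
        return tok[1:-1], pos
    try:
        return int(tok), pos
    except ValueError:
        return tok, pos          # a bare symbol

def uncompress(form):
    # Stand-in: the real compression scheme lives in daase.lisp.
    return form

def load_index(data):
    """Read the first number (N), seek N bytes in, read the index."""
    n, _ = read_form(data)
    return read_form(data, n)[0]

def fetch(data, item):
    """Non-negative numbers are offsets into the file; symbols,
    strings, and negative numbers are immediate data."""
    if isinstance(item, int) and item >= 0:
        return uncompress(read_form(data, item)[0])
    return item

# Build a two-domain toy database and look the stored entry back up.
stored = '("documentation for DomainA")'
index = '((DomainA 9) (DomainB -1 SomeSymbol))'
index_off = 9 + len(stored) + 1          # 9-byte header + body + newline
data = f"{index_off:08d}\n{stored}\n{index}"

for domain in load_index(data):
    name, *items = domain
    resolved = [fetch(data, i) for i in items]
```

Per-domain lists thus resolve to a mix of immediate data and forms read
back from elsewhere in the same file, which is what makes the format
random-access rather than load-everything.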
> The split request is going to be a challenge. I think I need to figure
> out how to dynamically add one file to the databases. Once I can do
> this I can split the database build into any convenient step size.
> Tim
> _______________________________________________
> Gcl-devel mailing list
> address@hidden
> http://mail.gnu.org/mailman/listinfo/gcl-devel

Camm Maguire                                            address@hidden
"The earth is but one country, and mankind its citizens."  --  Baha'u'llah
