web-hurd
From: Ian Duggan
Subject: Linux Cross Reference of the Hurd
Date: Thu, 25 Oct 2001 17:19:21 -0700

I've put samples of my LXR work on the Hurd up on my personal site. These
links are not permanent; they are for demonstration purposes only.


ORIGINAL

        http://www.ianduggan.net/hurd/hurdlxr-original/source/

This is the plain mod_perl LXR setup. The pages are generated dynamically and
all the form elements are enabled. It works straight from a checkout of the
Hurd source. It could also be pointed at a CVS tree and, using the CVS
mechanisms, make multiple versions available at once.



DYNAMIC -- tailored for staticizing

        http://www.ianduggan.net/hurd/hurdlxr-dynamic/source/

This is a version I created as input for generating a static copy of the site.
I edited the scripts to remove all form elements and dynamic searches. It is
essentially just the /source/ and /ident/ sections of the ORIGINAL.



STATIC -- 100% static html

        http://www.ianduggan.net/hurd/hurdlxr-static/source/

This is a 100% static version of the DYNAMIC site. I wrote a "staticizer"
script which spiders a URL and rewrites the links and filenames to allow
static browsing. It does things like turning CGI scripts into directories,
rewriting files such as parameter.c into parameter.c.html, and squashing
characters like ? and & in URLs into directory separators as well. The script
is not specific to the Hurd; I will release it eventually, after I have time
to polish it a bit.


I initially started this project so that we could have a static version of the
cross-referenced code that could be placed on the Hurd web pages and mirrored.
The static version takes up about 223MB and took 2 hours to generate on my
dual PIII 450 with lots of RAM. It took substantially longer on my web server,
a single-processor PIII 450 with only 64MB; I believe it was thrashing memory
the whole time, and a server with more memory would do better. The script
itself peaks at only about 20MB (the table of links it has already seen); the
rest of the memory was consumed by everything else running on the machine. The
20MB could be reduced by tying the Perl hash to a DBM file.
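The optimization mentioned above, keeping the seen-links table on disk rather than in memory, can be sketched as follows. This is a Python illustration of the same idea as tying a Perl hash to a DBM file; the function and file names are hypothetical, not from the staticizer itself.

```python
import dbm
import os
import tempfile

# Disk-backed "seen links" table: entries live in a DBM file, so the
# spider's memory footprint stays small no matter how many URLs it sees.
db_path = os.path.join(tempfile.mkdtemp(), "seen_links")
seen = dbm.open(db_path, "c")  # "c": create the file if it doesn't exist

def visit(url):
    """Return True the first time a URL is seen, False on repeats."""
    key = url.encode()
    if key in seen:
        return False
    seen[key] = b"1"
    return True

print(visit("http://example.org/source/parameter.c"))  # True  (first visit)
print(visit("http://example.org/source/parameter.c"))  # False (already seen)
```

The trade-off is the usual one: each membership test becomes a disk lookup instead of an in-memory hash probe, which is slower per URL but avoids the thrashing described above on a low-memory machine.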

The static version was the initial goal, but it seems that we can now use CGIs
at Savannah (ideally mod_perl?). Given the size of the static version, perhaps
a dynamic version would be better. The LXR scripts currently need MySQL or
PostgreSQL to work. Is either of these available at Savannah? If not, someone
has suggested looking at SQLite, which does not require a daemon.

A suggestion was also made to try to offer this as a service for all of
Savannah, but I would like to try it with just the Hurd first.

Comments?

-- Ian

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Ian Duggan                    address@hidden
                              http://www.ianduggan.net


