Re: [Axiom-developer] Re: Documentation of Axiom
Tue, 13 Dec 2005 12:18:24 -0800 (PST)
--- root <address@hidden> wrote:
> Axiom Volume 1: Tutorial is about to be published by lulu.com
> I have the "proof" version coming in the mail any day now.
> --patch-47 contains the source and pdf for the lulu version.
> Once the proof version has been reviewed it should be on sale.
> (the proceeds will likely go to the axiom foundation)
> I expect this to be available in the next week or so.
> Volume 9: Algebra is where I'm hoping you'll take the lead.
> Find research papers and PhD thesis work and get permission
> to use it for documentation of the theory. we have permission
> to use trager's thesis work and bronstein's thesis work which
> covers most of the integration theory. William Sit has done
> the PODE work and is a likely source of good theory as well
> as good code. Larry Lambe is also a good source along with
> Cliff Williamson, Patricia Gianni and numerous other "friends
> of axiom".
I've been wondering about that a little - many, many papers that might
be of interest to us are actually copyrighted by the journals that
published them. Does anybody know what the prospects are of convincing
one or more journals to contribute to the Axiom effort?
Obviously we can avoid copyright issues by re-expressing the knowledge
in the papers on our own (in a sense this is what I'm doing, or trying
to do, with the units package - incorporating design insights from
other work into a new work, and creating a new article in the process),
but that's a lot of work. The advantage is that it forces one to think
about the subject, and it may result in a better implementation in the end.
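To make the units idea concrete, here is a minimal sketch of the kind of
dimensional bookkeeping such a package has to do, written in Python purely
for illustration (the real work would of course be SPAD code in Axiom);
the `Quantity` type and its representation are hypothetical, not the
actual units package:

```python
# Hypothetical sketch of a quantity-with-dimensions type.
# Dimensions are a dict of base dimension -> integer exponent,
# e.g. velocity is {"m": 1, "s": -1}.
class Quantity:
    def __init__(self, value, dims):
        self.value = value          # numeric magnitude
        self.dims = dims            # dimension exponents

    def __mul__(self, other):
        # multiplication adds dimension exponents
        dims = dict(self.dims)
        for d, e in other.dims.items():
            dims[d] = dims.get(d, 0) + e
            if dims[d] == 0:
                del dims[d]         # drop cancelled dimensions
        return Quantity(self.value * other.value, dims)

    def __add__(self, other):
        # addition is only defined for identical dimensions
        if self.dims != other.dims:
            raise ValueError("dimension mismatch")
        return Quantity(self.value + other.value, dict(self.dims))

# usage: velocity * time = distance
v = Quantity(3.0, {"m": 1, "s": -1})
t = Quantity(2.0, {"s": 1})
d = v * t                           # 6.0 with dims {"m": 1}
```

The key design point is that dimension errors are caught at the operation
itself (the `__add__` check), rather than silently producing a meaningless
number.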
> Volume 10: Numerics is in process and will also take a long
> time. I have permission from a couple authors of research
> papers in numerics to quote their work and I'm in the process
> of doing that documentation as well as looking for other
> sources of theory. there is a lot to learn here and the theory
> is most enlightening. And, oh by the way, i'm trying to get it
> to work correctly.
I know this is probably not something you've thought much about yet,
Tim, but are we planning to add any kind of "awareness" of accuracy
limitations and error analysis to the numerical abilities of Axiom?
The reason I ask is that if the units effort ever gets finished, the
next piece I want to look at is error analysis. What I would like to
be able to do is perform physical science calculations in Axiom and
have complete knowledge of, and confidence in, the uncertainty
introduced by experimental measurement and by numerical approximation.
I think that with the units implementation and a really solid
error-analysis system, Axiom could become a major tool for scientific
computation.
Basically, the first two things they introduce in experimental physics
are errors and units, and they are the most consistently important
ideas (and the most consistently ignored/fudged parts of undergraduate labs) in
experimental science. My hope is that Axiom can build a whole library
of scientific capabilities someday which incorporate and enforce
correctness in errors and units throughout, but obviously those two
components must be created first, and they must be as correct, well
designed, and robust as we can make them. Symbolic computation
obviously doesn't deal with errors, but numerical computation does
(even implementing Real Numbers, a move I eagerly support, runs into
limits imposed by hardware constraints if nothing else). I know enough to know
it's not a simple subject, but it's probably worth doing if it can be
done. So I guess my main concern is: how difficult would it be to add
error analysis to Axiom's numerical routines, given how they are
currently designed/being designed?
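To give a sense of what the simplest form of this looks like, here is a
sketch of standard first-order (Gaussian) propagation of independent
uncertainties: for f(x, y), sigma_f^2 = (df/dx * sigma_x)^2 +
(df/dy * sigma_y)^2. The `Measured` type is a made-up illustration, not
an existing or proposed Axiom API, and a real system would also need
correlated errors, interval methods, etc.:

```python
import math

# Hypothetical value-with-uncertainty type using first-order
# propagation of independent errors.
class Measured:
    def __init__(self, value, sigma):
        self.value = value
        self.sigma = sigma          # one-standard-deviation uncertainty

    def __add__(self, other):
        # for sums, absolute errors add in quadrature
        return Measured(self.value + other.value,
                        math.hypot(self.sigma, other.sigma))

    def __mul__(self, other):
        # for products, relative errors add in quadrature
        v = self.value * other.value
        rel = math.hypot(self.sigma / self.value,
                         other.sigma / other.value)
        return Measured(v, abs(v) * rel)

# usage: (10.0 +/- 0.1) * (2.0 +/- 0.05)
p = Measured(10.0, 0.1) * Measured(2.0, 0.05)
```

Even this toy version shows why the feature has to live inside the
numerical routines rather than be bolted on afterwards: every
intermediate operation contributes to the final uncertainty.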
> it is agonizingly slow and very tedious to do a quality job.
> and it is hard to "write for the reader" rather than "write
> for the machine". plus there is so much to understand if
> you're going to explain a small piece of code.
Oddly enough, I'm reassured and encouraged by this. The fact that
Axiom as a project accepts the time required for quality, and is
willing to strive for that quality, is one of the things that makes
the project worthwhile.
> the "conventions", if you can call them that, are just what
> seemed to make sense at the time. since "exposing" my
> boilerplate here i've since learned a bit and changed how i
> do things. (e.g. \printindex) i agree that we need to discuss
> and complain about them as this is the only way to develop a
> group mindset. but the conventions are hard to invent
> without real situations. hyperlinking and the endpaper
> discussion is a good example. Bill Page's work is really
> driving a lot of new constraints on this.
Definitely. I still intend to try using pst-pdf and/or pdftricks as
well, to see if pdflatex can provide an alternative to the dvips
route, but I am much less worried about that now thanks to Bill's
work.
> eventually the interpreter, compiler, graphics, and browser
> will each be in their own books and not all globed together
> in src/interp. at which point we should be able to
> replace/upgrade/rewrite/extend the compiler without breaking
> the rest of the world.
Speaking of graphics, does anybody know of a good source for the
fundamental theories involved with producing good 2D and 3D plots? I
recall a case where the Maxima plotting routines were upgraded to
include a technique from Yacas, but I have no idea where these
techniques are documented. I expect Axiom has a lot of the key logic
in it already, but I am curious if we could add some features like
identification and marking of holes (for example). Then there is the
question of "presentation quality" graphics like those generated by VTK
and (to some extent) ZICLIB - lighting, surface reflectivity, etc.
Would the documentation of that sort of modeling technology be relevant here?
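The hole-marking idea above can be sketched quite simply: an adaptive
sampler that refines where the curve bends and records intervals where
the function blows up or is undefined. This is purely illustrative
(Python, invented names), not Axiom's actual plotting logic:

```python
# Sketch of adaptive 2D plot sampling that flags likely holes
# (undefined points or poles); illustrative only.
def sample(f, a, b, depth=8, tol=0.1):
    """Return (points, holes): sampled (x, y) pairs, plus x-intervals
    where f appears undefined or discontinuous."""
    points, holes = [], []

    def refine(x0, x1, y0, y1, d):
        xm = 0.5 * (x0 + x1)
        try:
            ym = f(xm)
        except (ZeroDivisionError, ValueError):
            holes.append((x0, x1))      # f undefined: mark a hole
            return
        # accept the midpoint if the curve is locally flat enough,
        # or if we have run out of refinement depth
        if d == 0 or abs(ym - 0.5 * (y0 + y1)) <= tol:
            points.append((xm, ym))
            if d == 0 and abs(y1 - y0) > 1.0 / tol:
                holes.append((x0, x1))  # huge jump: likely a pole
            return
        refine(x0, xm, y0, ym, d - 1)
        refine(xm, x1, ym, y1, d - 1)

    ya, yb = f(a), f(b)
    points.append((a, ya))
    refine(a, b, ya, yb, depth)
    points.append((b, yb))
    return sorted(points), holes
```

Sampling `lambda x: 1.0 / x` on [-1, 1] flags the interval containing
the singularity at 0, while a smooth function like x^2 produces no
holes; a real plotter would then draw the flagged interval as a gap or
marked point rather than connecting across it.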
> in fact, we should probably put out a call for annotated
> entries of existing works that could be reference materials.
> constructing a good annotated computer algebra bibliography
> would be worthwhile in itself.
That brings up a question I should probably pose on Axiom-legal - what
is the copyright status on the bibtex entries that are generated by
online journal sites? Presumably the purpose of these is to be
included, so I'm not really inclined to worry myself, but what do
y'all think?
> old latex/new latex? who knows. \eject was the only command
> i knew until you mentioned \newpage. i've started recoding
> \eject into \newpage when i open a source file for other
> changes. but either one works so it's purely a matter of
> style. we'll get flak eventually because THIS version of
> axiom today will use "OLD" commands tomorrow when latex17m
> comes out.
Heh. Probably true, assuming they do release a new LaTeX at some
point. Hopefully, updating the style won't be too difficult - in my
experience you can even use search and replace in some cases.
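For the \eject -> \newpage recoding specifically, a batch rewrite along
those lines might look like the following sketch (Python for
consistency with the other examples here; a shell one-liner would do
just as well, and the `.pamphlet` glob is an assumption about the
source layout):

```python
import re
from pathlib import Path

# Rewrite \eject to \newpage across a tree of pamphlet sources.
# Illustrative only; adjust the glob to the real layout.
def recode(root):
    for path in Path(root).rglob("*.pamphlet"):
        text = path.read_text()
        # \b after the macro name avoids touching e.g. \ejection
        new = re.sub(r"\\eject\b", r"\\newpage", text)
        if new != text:
            path.write_text(new)
```

The word-boundary anchor is the one subtlety: a naive string replace
would also rewrite any longer macro name that merely starts with
"eject".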
> i'd suggest that you try to write volume 9 as a real book that
> includes real algebra files with real documentation that still
> works in a real system and see what comes of the effort.
> clearly the conventions used in the algebra are going to be a
> dominant force throughout the rest of the system and you get
> to make them up as you go along. it may turn out that the
> algebra conventions are wildly different from the rest of the
> system because of your ALLPROSE technology. but that's ok as
> long as it builds, it works, and the algebra is fully documented.
> volume 9 is probably a 5 year effort.
Tim, would the units work (once it is done) be part of volume 9? Or
should it, along with error analysis, scientific constants, chemical
calculations, etc., be some hypothetical volume 10, "scientific
computation"?
Also, is there some standard way for a pamphlet to extend the TeX
output routines? I really need to dig into the source code and explore
Axiom's TeX abilities, but IIRC someone is already working on
upgrading them.