[Libann-dev] Re: ANN++
30 Aug 2002 03:48:11 -0500
> Pretty much all of my experience with neural networks has been with
> multi-layer perceptron networks. Therefore ANN++ is horribly biased
> toward such networks. Ideally, ::Network would be the base class for
> all types of neural nets, but I know that isn't possible in its current
> form and I'm not sure it is possible at all.
> This seems to contradict your generality principle. For instance a
> MLP can never have a connection to a node in the same layer. Does
> your library trap this?
Although generality was my aim, my limited background in non-MLP
networks led to bias. And no, I don't restrict users' ability to
connect nodes within the same layer.
> I'm thinking about a design where each layer would be implemented in
> terms of matrices, but there would be an interface in terms of
> neurons and connections.
> At a later stage, one could implement a gui to display a graphical
> representation of the network and see its weights and connections.
> However, I can't see that it would be of any serious utility other
> than perhaps a teaching tool for use with very simple nets. A nice
> gimmick though.
Ah, the silly blinking gui which shows you exactly where the activations
are in your net: I too wish to build such a worthless tool. On a more
serious note, I wholeheartedly agree with the matrix-per-layer design
with a neuron/connection interface.
> The NN library wouldn't actually depend upon Octave itself, just one
> of its libraries. Another option would be to use gsl (GNU Scientific
> Library), but the eigenvector method in this library is documented as
> inefficient for large matrices.
I'm actually a co-author of the Scythe Statistical Library
(scythe.wustl.edu), which contains a very STL-friendly and general
template matrix class (as of the 0.3 release, a rewrite I put together
of the original code that should come out within the next month - code
is done, docs aren't). We've implemented quite a bit of linear algebra,
decompositions, etc. for the library, although we haven't included
eigenvectors yet. A grad student at the University of Washington with a
background in numerical methods is supposed to be working on
eigenvectors/values and svd as we speak, but I don't know when s/he
will finish. I'm open to octave as well. The gsl is a wonderful
library, but is a real pain to deal with in my opinion. It doesn't
strive to make life easy for the user.
> But presumably it remains the user's responsibility to ensure that the
> algorithm is appropriate to the architecture??
> I probably need to look at your code a little more closely. I was
> confused by this area. Surely *every* connection is `weighted' ?? A
> connection without a weight is the same as no connection at all.
Yes, user's responsibility. As for unweighted connections, there is a
base ::Connection class that takes care of most of the underlying
structural responsibilities. It is technically weighted (w = 1)
although it has no instance variable specifying a weight. I mostly
split them up so people could inherit the very basics of connections if
they wished to create connections that didn't fit exactly with my
weighted design.
> * Optimised for parallel processing.
This is a tricky one. We want a linear version that isn't contaminated
by a reliance on threading or message-passing libraries, but we need to
design in such a way that plugging on a parallel interface is trivial.
> * A low level interface where networks can be constructed piecemeal.
> * A higher level interface at which networks are completely
> constructed and useable at instantiation.
> * Support for the common types of network, including MLP, Kohonen, and
> Boltzmann machines.
> * The ability to dynamically modify a network where feasible (for
> instance, to add or delete nodes).
> * The ability to `disable' nodes in order to investigate redundancy.
> * The ability to easily persist a network (preferably in a machine
> independent format).
> * The ability to easily substitute training algorithms to investigate
> their effectiveness.
Ensure that it is easy to reach into the network structures to get
diagnostics from the network such as activation levels of specific
neurons. This will make it easier to create tools for tweaking network
performance and it won't hurt for the little visualization app either.
Also, push as much of the traits that all networks share down into the
bowels of the system. This will be great for us in terms of reusable
code and will help other developers (who we hope will add algorithms to
the library) see where the base library ends and each algorithm begins.
These are both design characteristics rather than user-oriented goals,
but they are design goals if you are thinking in terms of long-term
maintainability.
| Daniel Pemstein |
| Email: address@hidden, address@hidden |
| Web: http://www.danpemstein.com |
| PGP: http://www.danpemstein.com/files/pgp.txt |