
RE: Working on nnet package

From: Mohammed Elmusrati
Subject: RE: Working on nnet package
Date: Wed, 6 Mar 2019 15:57:03 +0200



Yes, I totally agree. Machine learning is state of the art now, and it would be great to develop more functions to support this extremely important area. Since I use Octave in my Machine Learning course, I have written a few functions for my students to implement ANNs, for example. They work fine under Octave as well as Matlab, but they are not similar to the Matlab ANN toolbox structure. I may add them to GitHub so a wider range of users can use them.


Question: Is it necessary for the developed packages to be identical to the Matlab toolboxes? Is it possible to have Octave-style CNN or deep learning packages?


Thank you


M. Elmusrati






From: vrozos
Sent: Wednesday, March 6, 2019 3:40 PM
To: address@hidden
Subject: Working on nnet package


The development of the NNET package by the original author, Michael Schmid,
stopped in 2010. Three years ago Francesco Faccio [1] volunteered to
continue the development. However, NNET is still stuck on version 0.1.13,
last released on 2010-12-02. As a result, the recent changes in Octave and
the deprecation of some functions (e.g. finite) have rendered this package
broken. This is a shame: NNET is not state of the art, but it is the only
readily available option for neural networks in Octave. Currently, it is
useless to the average Octave user only because of some very minor issues.


I would like to contribute to this package by doing some development that is
minor in terms of the amount of work, but critical for the functionality
(e.g. replacing deprecated functions and symbols, improving the
documentation, etc.).
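To illustrate the kind of fix involved (this is a sketch, not the actual patch — the directory name and file contents are made up for the demo): the removed `finite` function can be located and conservatively replaced with `isfinite` across the package sources.

```shell
# Hypothetical sketch: find uses of the removed `finite` function in
# .m files so they can be replaced with `isfinite`.
# Demo on a throwaway file rather than a real NNET checkout:
mkdir -p /tmp/nnet_demo
printf 'if finite (Pp)\nend\n' > /tmp/nnet_demo/demo.m

# Locate the deprecated calls (GNU grep; \b keeps isfinite from matching):
grep -rn --include='*.m' '\bfinite *(' /tmp/nnet_demo

# Conservative automated replacement (GNU sed; review the diff afterwards):
sed -i 's/\bfinite *(/isfinite (/g' /tmp/nnet_demo/demo.m
grep -n 'isfinite' /tmp/nnet_demo/demo.m
```

Each replacement would still need eyeballing, since `finite` may also appear in comments or variable names.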


I am familiar with Mercurial and GitHub, but I don't know how Sourceforge works.





Evangelos Rozos












