Re: SV: [Help-gsl] On conjugate gradient algorithms in multidimensional minimisation problems.
Mon, 05 Dec 2005 12:30:31 +0100
After extensive tests, here are the results. The tests were run on a
simplified problem with an absolute minimum number of parameters (16)
without any constraints.
For comparison, I ran the test on the Fletcher-Reeves, Polak-Ribiere,
vector Broyden-Fletcher-Goldfarb-Shanno (BFGS) and mm_hess algorithms
[the last kindly provided by James in a private communication; I hope
it makes it into mlib].
The points at which the different algorithms bailed out are very
close to each other in parameter space.
Convergence was analyzed by recording chi squared versus iteration number.
Fletcher-Reeves: small plateau at the start, large drop in chi
squared, small plateau, bailed out. Chi squared reached: 6.54 by
Polak-Ribiere: 4 plateaus on the way, reached chi squared 6.53 by
BFGS: 6 plateaus on the way, reached chi squared 6.53 by iteration 7800.
mm_hess: no plateaus, a smooth curve resembling 1/(iteration number),
reached chi squared 6.51 by iteration 23'300.
So far, it seems that the algorithms tend toward the same point, but none
actually converges. mm_hess takes more iterations, but finds a lower chi
squared, and if one measures stability by the absence of plateaus, this
is a nice method, which, I hope, will be available in James' mlib some day.
I will run some tests on a problem with constraints, and see how the
different algorithms fare there.
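For the constrained follow-up tests, one standard way to keep using these unconstrained minimizers is a quadratic penalty added to chi squared, with the weight raised between runs. This is a generic textbook device, not anything taken from the thread; the lower bounds in `lo` are purely hypothetical.

```c
/* Quadratic penalty for lower-bound constraints p[i] >= lo[i]:
 * zero when all bounds hold, mu * violation^2 summed otherwise.
 * Add its value (and the matching gradient term) to the
 * unconstrained chi squared, then increase mu between minimizer
 * runs.  Illustrative only; the bounds here are hypothetical. */
static double penalty(const double *p, const double *lo, int n, double mu) {
    double s = 0.0;
    for (int i = 0; i < n; ++i) {
        double v = lo[i] - p[i];   /* positive means the bound is violated */
        if (v > 0.0)
            s += mu * v * v;
    }
    return s;
}
```

Minimizing chi2(p) + penalty(p, lo, n, mu) for a growing sequence of mu drives the solution toward the feasible region while keeping the objective smooth enough for gradient-based minimizers.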
Martin Jansche wrote:
On 11/29/05, Max Belushkin <address@hidden> wrote:
James, thank you, I will certainly give it a go in the next couple of
days, and will let you know how it works out.
Please share your findings once you have had a chance to try different
strategies. Another option would be to try the optimizers in the TAO