Thanks for that, really appreciate your help.
The problem I have now is that the main fitting loop
seems to just bail out on me immediately:
  do
    {
      iter++;
      status = gsl_multifit_fdfsolver_iterate (s);
      printf ("status = %s\n", gsl_strerror (status));
      print_state (iter, s);
      if (status)
        break;
      status = gsl_multifit_test_delta (s->dx, s->x,
                                        1e-4, 1e-4);
    }
  while (status == GSL_CONTINUE && iter < 500);
"if (status)" always evaluates to true (status == -2, I checked), and the
error string is actually "the iteration has not converged yet". If I
change this to "if (status != -2)", then it just keeps going until the
maximum iterations are up. So status == GSL_CONTINUE, but the worrying
thing is that at each pass the numbers output by print_state() are the
same, as if there is no gradient descent happening.
Am I missing something obvious? It doesn't seem to make any difference
what I put in x_init, be it the output of the guesstimator, 1s, or
some other random number. It never seems to descend.
I removed all the gsl_rng* stuff because I have a fixed data block
for testing, so I can make meaningful comparisons between different
algorithms and libraries (i.e. the content of y is pre-generated). That
couldn't be the cause of the problem, could it?
My Jacobian setting is:
  for (i = 0; i < n; i++)
    {
      // Yi = a * sin(b / i + c) + d
      gsl_matrix_set (J, i, 0, sinf(b / i + c) / sigma[i]);
      gsl_matrix_set (J, i, 1, a * cosf(b / i + c) / sigma[i] / i);
      gsl_matrix_set (J, i, 2, a * cosf(b / i + c) / sigma[i]);
      gsl_matrix_set (J, i, 3, 1 / sigma[i]);
    }
But even if this is wrong, surely it should still descend
given random starting points, should it not?
On Wed, 18 Jul 2007, John Gehman wrote:
The Jacobian matrix J is the differential of your objective function
with respect to each parameter (corresponding to the second index)
at each data point (corresponding to the first index).
The relevant equation is [a * sin(b / t + c) + d - yi ] / sigma
i.e. you're trying to minimize the difference between the model and
the data (yi), where some points may be more reliable than others
(given sigma[i]), by optimizing the amplitude, period, phase, and
offset of your sine wave. The Jacobian provides the analytic
equations to inform the system of the sensitivity of the
evolving fit to changes in each of the parameters.
Taking the partial derivative of your equation with respect to each
of the floating parameters, and setting them appropriately into the
matrix J, assuming your vector of floating parameters elsewhere is
ordered (a,b,c,d), and retaining your s = sigma[i], I get:
  gsl_matrix_set (J, i, 0, sin(b / t + c) / s);        /* derivative with respect to a */
  gsl_matrix_set (J, i, 1, cos(b / t + c) * a/(t*s));  /* derivative with respect to b */
  gsl_matrix_set (J, i, 2, cos(b / t + c) * a/s);      /* derivative with respect to c */
  gsl_matrix_set (J, i, 3, 1/s);                       /* derivative with respect to d */
The analytic derivatives are usually my problem, however, so please