
[Help-gsl] Overhaul of GSL nonlinear least squares

From: Patrick Alken
Subject: [Help-gsl] Overhaul of GSL nonlinear least squares
Date: Sat, 2 Jul 2016 16:40:55 -0600
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:38.0) Gecko/20100101 Thunderbird/38.8.0

Hello all,

  The nonlinear least squares capabilities of GSL have been completely
rewritten. The old interface is still available as gsl_multifit_fdfsolver,
but it is now obsolete; the new interface is called *gsl_multifit_nlinear*.
The new interface is based entirely on trust-region methods and gives the
user much more control over how the iteration proceeds. It also provides
several trust-region methods which the user can switch between easily,
whereas the old library offered only a Levenberg-Marquardt method. The new
module offers:

Levenberg-Marquardt with geodesic acceleration
Double dogleg

Additionally, I'm currently working on a 2D subspace method (a sort of
generalized dogleg method), but it's not ready yet.

For small to moderate-sized problems, where the user can provide the full
Jacobian matrix, the matrix must be factored at each iteration. The old
library offered only a QR factorization; the new module lets the user
choose between Cholesky, QR, and SVD. Cholesky is the fastest but also the
least accurate when the Jacobian is rank deficient or near singular; SVD
is the slowest but the most accurate in the rank-deficient case; QR is in
between.

The new library also offers a brand-new interface for very large problems,
called *gsl_multilarge_nlinear*. Here, instead of providing the full
Jacobian matrix to the solver, the user provides a routine which multiplies
a given vector v by the Jacobian, computing J*v or J^T * v. This way the
user can solve problems where the full Jacobian matrix is too large to fit
in memory, and can take full advantage of any sparse structure in the
Jacobian to speed things up. Currently, this interface offers only one
solver, based on the Steihaug-Toint algorithm: another generalized dogleg
method, which uses an iterative conjugate gradient method to solve for the
Gauss-Newton step. I'm hoping to add the other methods (LM, dogleg, etc.)
soon.

I believe these updates now make GSL highly competitive with other
nonlinear least squares libraries (in fact I haven't seen any other
libraries which offer all the features and algorithms GSL does now).

Currently only unconstrained optimization is available - hopefully one
day I can get around to adding support for constrained problems.

Everything is documented in the manual, with several example programs
(on the git repository). For those of you solving nonlinear least
squares problems, I'd appreciate any feedback on the new interface.

Happy 4th of July,
