Algorithms for Nonlinear Least Squares Minimization
The coefficients of a nonlinear model are obtained by minimizing either the sum of squared residuals or the negative log-likelihood function for the model. Many algorithms can perform this nonlinear least squares minimization; they use iterative numerical methods to estimate the coefficients from the data points (X, Y).
In this section, we describe a few representative algorithms, namely Newton's method, the Gauss-Newton algorithm, the Levenberg-Marquardt algorithm, and the method of iteratively reweighted least squares. Each algorithm is described in detail with a numerical example. At the beginning, the mathematical tools used in the algorithms, such as the Jacobian and Hessian matrices, are described. The aim of this section is to give an idea of the methods used by describing a few representative algorithms.
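Before looking at the individual algorithms, it helps to see the objective they all share. The sketch below is illustrative only: the exponential model f(x) = a·exp(b·x) and the sample data points are assumptions for demonstration, not part of this section. It simply computes the sum of squared residuals that the iterative methods minimize.

```python
import numpy as np

# Hypothetical data points (X, Y); values chosen only for illustration.
X = np.array([0.0, 1.0, 2.0, 3.0])
Y = np.array([2.0, 2.7, 3.8, 5.1])

def residuals(beta, X, Y):
    """Residuals r_i = y_i - f(x_i; beta) for an assumed
    exponential model f(x) = a * exp(b * x)."""
    a, b = beta
    return Y - a * np.exp(b * X)

def sum_of_squares(beta, X, Y):
    """Objective S(beta) = sum of squared residuals, the quantity
    that the iterative algorithms in this section minimize."""
    r = residuals(beta, X, Y)
    return float(r @ r)

# Evaluate the objective at one trial coefficient vector (a, b) = (2.0, 0.3).
S = sum_of_squares(np.array([2.0, 0.3]), X, Y)
print(S)
```

Each algorithm described below differs only in how it chooses the next trial value of the coefficient vector beta so that S(beta) decreases from one iteration to the next.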
Click the links below to go through the topics one by one:
1. Mathematical concepts required
2. Gradient descent algorithm for nonlinear regression
3. Newton's method for approximate solutions to nonlinear equations
4. Gauss-Newton method for nonlinear least squares minimization
5. Levenberg-Marquardt method for nonlinear least squares minimization
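As a preview of the iterative methods listed above, the following is a minimal sketch of a Gauss-Newton iteration under stated assumptions: the exponential model f(x) = a·exp(b·x), the sample data, and the starting point are all hypothetical, chosen only to show the shape of one update step (residual vector, Jacobian, normal equations).

```python
import numpy as np

# Hypothetical data points (X, Y) used only for demonstration.
X = np.array([0.0, 1.0, 2.0, 3.0])
Y = np.array([2.0, 2.7, 3.8, 5.1])

def gauss_newton(beta, X, Y, iters=20):
    """Gauss-Newton iterations for the assumed model f(x) = a * exp(b * x)."""
    a, b = beta
    for _ in range(iters):
        f = a * np.exp(b * X)
        r = Y - f                                   # residual vector
        # Jacobian of the residuals w.r.t. (a, b):
        #   dr/da = -exp(b*x),  dr/db = -a * x * exp(b*x)
        J = np.column_stack([-np.exp(b * X), -a * X * np.exp(b * X)])
        # Solve the normal equations (J^T J) delta = -J^T r for the update step.
        delta = np.linalg.solve(J.T @ J, -(J.T @ r))
        a, b = a + delta[0], b + delta[1]
    return np.array([a, b])

# Starting point (a, b) = (1.0, 0.5) is an arbitrary initial guess.
beta_hat = gauss_newton(np.array([1.0, 0.5]), X, Y)
```

The Gauss-Newton and Levenberg-Marquardt topics linked above develop this update step in detail; Levenberg-Marquardt modifies the normal equations with a damping term to improve robustness when the starting guess is poor.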