# Gauss–Newton algorithm


The Gauss–Newton algorithm is a method used to solve non-linear least squares problems. It can be seen as a modification of Newton's method for finding a minimum of a function. Unlike Newton's method, the Gauss–Newton algorithm can only be used to minimize a sum of squared function values, but it has the advantage that second derivatives, which can be challenging to compute, are not required.

Non-linear least squares problems arise for instance in non-linear regression, where parameters in a model are sought such that the model is in good agreement with available observations.

The method is named after the mathematicians Carl Friedrich Gauss and Isaac Newton.

## Description

Given m functions r1, …, rm of n variables β = (β1, …, βn), with m ≥ n, the Gauss–Newton algorithm finds the minimum of the sum of squares $S(\boldsymbol \beta)= \sum_{i=1}^m r_i^2(\boldsymbol \beta).$

Starting with an initial guess $\boldsymbol \beta^{(0)}$ for the minimum, the method proceeds by the iterations $\boldsymbol \beta^{(s+1)} = \boldsymbol \beta^{(s)}+\Delta,$

where Δ is a small step. A second-order Taylor expansion of S about $\boldsymbol \beta^{(s)}$ gives $S(\boldsymbol \beta^{(s)} + \Delta) \approx S(\boldsymbol \beta^{(s)}) + \left[\frac{\partial S}{\partial \beta_i}\right]^\top \Delta + \frac{1}{2} \Delta^\top \left[\frac{\partial^2 S}{\partial \beta_i\,\partial \beta_j}\right] \Delta.$

If we define the $m \times n$ Jacobian matrix $\mathbf{J_r}$ by $(\mathbf{J_r})_{ij} = \left.\frac{\partial r_i}{\partial \beta_j}\right|_{\boldsymbol \beta}$,

we can replace the gradient $\left[\frac{\partial S}{\partial \beta_i}\right]$ with $2\,\mathbf{J_r}^\top \mathbf{r}$ (since $S = \sum_i r_i^2$), and the Hessian can be approximated by $2\,\mathbf{J_r}^\top \mathbf{J_r}$, neglecting the second-derivative terms of the residuals (a good approximation when the residuals are small). This gives

$S(\boldsymbol \beta^{(s)} + \Delta) \approx S(\boldsymbol \beta^{(s)}) + 2\,\mathbf{r}^\top \mathbf{J_r}\,\Delta + \Delta^\top \mathbf{J_r}^\top \mathbf{J_r}\,\Delta.$

We then take the derivative with respect to Δ and set it equal to zero to find a solution:

$\nabla_\Delta S(\boldsymbol \beta^{(s)} + \Delta) \approx 2\,\mathbf{J_r}^\top \mathbf{r} + 2\,\mathbf{J_r}^\top \mathbf{J_r}\,\Delta = \mathbf{0}.$

Dividing by 2 and rearranging gives the normal equations, which can be solved for Δ:

$\left(\mathbf{J_r}^\top \mathbf{J_r}\right) \Delta = -\mathbf{J_r}^\top \mathbf{r}.$
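As a concrete sketch, the iteration above can be written in a few lines of Python with NumPy (`gauss_newton` is a hypothetical helper for illustration, not a library function). Instead of forming $\mathbf{J_r}^\top \mathbf{J_r}$ explicitly, the step solves the mathematically equivalent linear least-squares problem $\mathbf{J_r}\,\Delta \approx -\mathbf{r}$, which is numerically better conditioned:

```python
import numpy as np

def gauss_newton(r, J, beta0, tol=1e-10, max_iter=50):
    """Minimize S(beta) = sum(r(beta)**2) by Gauss-Newton iteration.

    r: callable returning the residual vector r(beta), shape (m,)
    J: callable returning the Jacobian J_r(beta), shape (m, n)
    beta0: initial guess for the parameters, length n
    """
    beta = np.asarray(beta0, dtype=float)
    for _ in range(max_iter):
        res = r(beta)
        Jac = J(beta)
        # Solve (J^T J) delta = -J^T r via the equivalent least-squares
        # problem J delta ~ -r, avoiding the explicit normal equations.
        delta, *_ = np.linalg.lstsq(Jac, -res, rcond=None)
        beta = beta + delta
        if np.linalg.norm(delta) < tol:
            break
    return beta
```

For example, the residuals $r_1 = \beta_1 - 1$, $r_2 = 10(\beta_2 - \beta_1^2)$ (a zero-residual test problem) are driven to zero in a couple of steps from the origin.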

In data fitting, where the goal is to find the parameters β such that a given model function y = f(x, β) best fits some data points (xi, yi), the functions ri are the residuals $r_i(\boldsymbol \beta) = y_i - f(x_i, \boldsymbol \beta).$

Then, since $\partial r_i / \partial \beta_j = -\partial f(x_i, \boldsymbol \beta) / \partial \beta_j$, i.e. $\mathbf{J_r} = -\mathbf{J_f}$, the increment Δ can be expressed in terms of the Jacobian of the function f, as $\left(\mathbf{J_f}^\top \mathbf{J_f}\right) \Delta = \mathbf{J_f}^\top \mathbf{r}.$
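As a minimal data-fitting sketch, the update can be applied to a hypothetical exponential model $f(x, \boldsymbol\beta) = \beta_1 e^{\beta_2 x}$ on noiseless synthetic data (model, data, and starting point are assumptions for this example, not from the text). Gauss–Newton is only locally convergent, so the initial guess is taken reasonably close to the solution:

```python
import numpy as np

# Synthetic, noiseless data from y = 2 * exp(0.5 * x) (assumed for illustration).
x = np.linspace(0.0, 2.0, 8)
y = 2.0 * np.exp(0.5 * x)

beta = np.array([1.5, 0.4])  # initial guess (b1, b2), near the true (2.0, 0.5)
for _ in range(50):
    f = beta[0] * np.exp(beta[1] * x)
    r = y - f                                  # residuals r_i = y_i - f(x_i, beta)
    # Jacobian of f with respect to (b1, b2), shape (m, 2):
    # df/db1 = exp(b2 * x),  df/db2 = b1 * x * exp(b2 * x)
    Jf = np.column_stack([np.exp(beta[1] * x),
                          beta[0] * x * np.exp(beta[1] * x)])
    # Increment from the normal equations (Jf^T Jf) delta = Jf^T r
    delta = np.linalg.solve(Jf.T @ Jf, Jf.T @ r)
    beta = beta + delta
    if np.linalg.norm(delta) < 1e-12:
        break
```

On this noiseless data the residuals can be driven to zero, and `beta` converges to the generating parameters (2.0, 0.5).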

Wikimedia Foundation. 2010.
