The Gauss-Newton algorithm minimizes the sum of squared residuals by iteratively refining the parameter estimates $\theta$ (illustrative code sketches of each step follow the list):

1. **Define the Residuals**: The residuals $\mathbf{r}(\theta)$ represent the difference between the observed values $y$ and the predicted values $\hat{y}(\theta)$:

$$
\mathbf{r}(\theta) = y - \hat{y}(\theta)
$$

The goal is to minimize the sum of squared residuals.

2. **Compute the Jacobian Matrix**: The Jacobian $J$ is a matrix of partial derivatives of the residuals with respect to the parameters. If $r_i(\theta)$ is the $i$th residual, the Jacobian is:

$$
J_{ij} = \frac{\partial r_i(\theta)}{\partial \theta_j}
$$

The Jacobian describes how each residual changes with respect to changes in the parameters.

3. **Update the Parameters**: Use the Gauss-Newton update rule:

$$
\theta^{(t+1)} = \theta^{(t)} - (J^T J)^{-1} J^T \mathbf{r}(\theta^{(t)})
$$

This update adjusts the parameters $\theta$ by solving a linear approximation to the least squares problem at each iteration.

4. **Iterate Until Convergence**: Repeat the parameter update until the change in the sum of squared residuals or the parameters themselves becomes smaller than a pre-defined threshold, indicating convergence.
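
As a rough illustration of step 1, here is a minimal NumPy sketch of the residual vector, assuming a hypothetical exponential model $\hat{y} = \theta_0 e^{\theta_1 x}$; the model, function names, and data arrays are illustrative and not part of the original description:

```python
import numpy as np

def predict(theta, x):
    # Hypothetical model used only for illustration: y_hat = theta0 * exp(theta1 * x)
    return theta[0] * np.exp(theta[1] * x)

def residuals(theta, x, y):
    # r(theta) = y - y_hat(theta), one entry per observation
    return y - predict(theta, x)
```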
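
For step 2, the Jacobian of those residuals can be written out analytically for the same hypothetical model (a finite-difference approximation would serve equally well); a sketch:

```python
import numpy as np

def jacobian(theta, x):
    # J[i, j] = d r_i / d theta_j for the illustrative model y_hat = theta0 * exp(theta1 * x).
    # Since r_i = y_i - y_hat_i, each column is the negative partial derivative of y_hat.
    e = np.exp(theta[1] * x)
    return np.column_stack([
        -e,                  # d r_i / d theta_0
        -theta[0] * x * e,   # d r_i / d theta_1
    ])
```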
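
Step 3 maps directly onto a linear solve. A minimal sketch of one update, solving the normal equations $(J^T J)\,\delta = J^T \mathbf{r}$ instead of forming the inverse explicitly:

```python
import numpy as np

def gauss_newton_step(theta, J, r):
    # Solve (J^T J) delta = J^T r; algebraically the same as the update rule above,
    # but avoids computing (J^T J)^{-1} explicitly.
    delta = np.linalg.solve(J.T @ J, J.T @ r)
    return theta - delta
```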
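
Putting the pieces together for step 4, a minimal iteration loop might look like the sketch below; the tolerance, iteration cap, and the choice of stopping on the size of the parameter update are assumptions rather than prescriptions from the text:

```python
import numpy as np

def gauss_newton(theta0, x, y, residuals, jacobian, tol=1e-8, max_iter=50):
    theta = np.asarray(theta0, dtype=float)
    for _ in range(max_iter):
        r = residuals(theta, x, y)
        J = jacobian(theta, x)
        delta = np.linalg.solve(J.T @ J, J.T @ r)  # Gauss-Newton step
        theta = theta - delta
        # Declare convergence once the parameter update is negligibly small.
        if np.linalg.norm(delta) < tol:
            break
    return theta

# Illustrative use with the residuals/jacobian sketches above and synthetic data:
# x = np.linspace(0.0, 1.0, 50)
# y = 2.0 * np.exp(1.5 * x)
# theta_hat = gauss_newton([1.0, 1.0], x, y, residuals, jacobian)
```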