Useful Definitions

Hessian matrix
Powell's method
correlation matrix
covariance matrix
gradient descent

The Hessian is the NxN matrix of second partial derivatives of a function of N parameters: element (i,j) is the second partial derivative of f(p1, p2, ..., pN) with respect to p_i and p_j. For our purposes, we are concerned with the Hessian of the error surface, evaluated at the optimal parameter values.
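As a sketch of the definition (assuming NumPy is available; the function and evaluation point are illustrative), the Hessian can be estimated numerically with central finite differences:

```python
import numpy as np

def hessian(f, p, h=1e-4):
    """Estimate the NxN Hessian of f at parameter vector p
    using central finite differences."""
    n = len(p)
    H = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            pp = p.copy(); pp[i] += h; pp[j] += h
            pm = p.copy(); pm[i] += h; pm[j] -= h
            mp = p.copy(); mp[i] -= h; mp[j] += h
            mm = p.copy(); mm[i] -= h; mm[j] -= h
            # Second partial w.r.t. p_i and p_j via four function evaluations
            H[i, j] = (f(pp) - f(pm) - f(mp) + f(mm)) / (4 * h * h)
    return H

# Example: f(p) = p0^2 + 3*p0*p1 + 2*p1^2 has constant Hessian [[2, 3], [3, 4]]
f = lambda p: p[0]**2 + 3*p[0]*p[1] + 2*p[1]**2
H = hessian(f, np.array([1.0, 2.0]))
```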
Powell's method is a direction-set optimization algorithm used to fit functions to data. It is similar in spirit to gradient descent, but it does not require derivatives of the function being minimized. See Numerical Recipes in C.
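For illustration (assuming SciPy is available; the model and data here are made up), SciPy exposes Powell's method through its general minimizer, which can be pointed at a sum-of-squared-errors function:

```python
import numpy as np
from scipy.optimize import minimize

# Fit y = a*exp(b*x) to data by minimizing squared error with
# Powell's derivative-free method.
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(1.5 * x)          # noiseless data from a=2.0, b=1.5

def sse(params):
    a, b = params
    return np.sum((y - a * np.exp(b * x)) ** 2)

res = minimize(sse, x0=[1.0, 1.0], method='Powell')
# res.x should land near (2.0, 1.5)
```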

The correlation matrix is an NxN matrix whose elements are calculated from the covariance matrix C as

    r(i,j) = C(i,j) / sqrt(C(i,i) * C(j,j))

This matrix tells us to what degree the parameters are correlated. The entries can be taken as a measure of the "orthogonality" of the given parameters. Zero correlation between two parameters means no interaction. Conversely, non-zero correlation means that the error surface has local iso-error contours (hyperellipses) that are rotated with respect to the parameter axes; in other words, the error surface is not separable in those parameters.
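As a small sketch (assuming NumPy; the covariance values are made up), the correlation matrix follows from the covariance matrix by dividing each entry by the product of the corresponding standard deviations:

```python
import numpy as np

# r(i,j) = C(i,j) / sqrt(C(i,i) * C(j,j))
C = np.array([[4.0, 2.0],
              [2.0, 9.0]])         # covariance matrix (illustrative values)
d = np.sqrt(np.diag(C))            # standard deviations sqrt(C(i,i))
R = C / np.outer(d, d)             # correlation matrix
# R has ones on the diagonal; R[0, 1] = 2 / (2 * 3) = 1/3
```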

The covariance matrix is an NxN matrix that is the matrix inverse of the Hessian matrix. The elements down the diagonal give us the variance in our estimate of each optimal parameter value, under the assumption that the fitting error is Gaussian. Thus a 95% confidence interval estimate for parameter i would be

    p_i +/- 1.96 * sqrt(C(i,i))

The covariance matrix also tells us about the independence of two parameters: element (i,j) is the covariance between parameters i and j. The correlation matrix, which is derived from the covariance matrix, is usually better suited for this purpose.
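Putting the covariance and confidence-interval definitions together (assuming NumPy; the Hessian and fitted values here are illustrative, not from a real fit):

```python
import numpy as np

# Covariance matrix as the inverse of the Hessian at the optimum,
# and a 95% confidence interval per parameter (Gaussian-error assumption).
H = np.array([[8.0, 2.0],
              [2.0, 4.0]])         # Hessian at the optimum (illustrative)
p_opt = np.array([1.0, 3.0])       # fitted parameter values (illustrative)

C = np.linalg.inv(H)               # covariance matrix
sigma = np.sqrt(np.diag(C))        # standard error of each parameter
ci_low = p_opt - 1.96 * sigma      # lower 95% bound per parameter
ci_high = p_opt + 1.96 * sigma     # upper 95% bound per parameter
```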

Gradient descent is a term often used to refer to a class of function-fitting methods in which a parameter vector is repeatedly and incrementally perturbed so as to reduce the error in the fit of the parameterized function to the data. The direction of perturbation is (generally intended to be) determined by the gradient of the error surface, stepping opposite the gradient so that the error decreases. Most algorithms use a modified form of gradient descent for practical reasons.
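A minimal sketch of the idea (assuming NumPy; the data, learning rate, and iteration count are all illustrative choices), fitting a line y = m*x + c by stepping opposite the gradient of the squared error:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0                  # data generated from m=2, c=1

m, c = 0.0, 0.0                    # initial parameter guesses
lr = 0.02                          # learning rate (step size)
for _ in range(5000):
    resid = (m * x + c) - y        # residuals of the current fit
    grad_m = 2 * np.sum(resid * x) # d(error)/dm
    grad_c = 2 * np.sum(resid)     # d(error)/dc
    m -= lr * grad_m               # step opposite the gradient
    c -= lr * grad_c
# m and c should approach 2.0 and 1.0
```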