The algorithms presented here can be applied to quadratic and non-quadratic
objective functions alike. The term **local** refers
both to the fact that only information about the function in the
neighborhood of the current approximation is used in updating that
approximation, and to the fact that we usually expect such
methods to converge to whichever local extremum lies closest
to the starting approximation. As a result,
the global structure of an objective function
is invisible to a local method.
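This local behavior is easy to demonstrate with plain gradient descent. The following is a minimal Python sketch (not COOOL code; the function, step size, and iteration count are chosen only for illustration): on an objective with two minima, the method settles into whichever basin contains the starting point.

```python
def gradient_descent(df, x0, step=0.05, iters=200):
    """Follow only the local slope; converges to the local minimum
    of the basin containing the starting point x0."""
    x = x0
    for _ in range(iters):
        x -= step * df(x)
    return x

# Illustrative objective: f(x) = x**4 - 3*x**2 + x has two local
# minima, near x = -1.30 and x = +1.15.
df = lambda x: 4 * x**3 - 6 * x + 1   # derivative of f

left = gradient_descent(df, -1.0)     # settles near -1.30
right = gradient_descent(df, 1.0)     # settles near +1.15
```

Started from different points, the same method returns different answers; nothing in the iteration tells it which minimum is global.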
Some of these techniques, such
as Downhill Simplex
and Powell's method, do not require
the derivatives of the objective function.
Others, such as the quasi-Newton
methods,
require at least the gradient.
In the latter case, if analytic
expressions for the derivatives are not available, a module for
finite-difference calculation of the gradient is provided.
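The idea behind such a module can be sketched as follows. This is an illustrative Python version using central differences, not the module's actual interface; the function name `fd_gradient` and step size `h` are assumptions.

```python
def fd_gradient(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at x.
    Each component is estimated from two extra function evaluations."""
    grad = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2.0 * h))
    return grad

# Example: f(x, y) = x**2 + 3*y has gradient (2x, 3) = (4, 3) at (2, 5)
g = fd_gradient(lambda v: v[0] ** 2 + 3 * v[1], [2.0, 5.0])
```

The central difference is second-order accurate in `h`, at the cost of two function evaluations per component.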
*COOOL* also includes nonquadratic generalizations
of the conjugate gradient method
incorporating two different kinds of line search
procedures.
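A nonquadratic conjugate gradient iteration of this general shape can be sketched in Python. This is an illustrative version using the Fletcher-Reeves update and a backtracking (Armijo) line search; COOOL's own classes, update formulas, and line searches may differ, and the test objective below is a hypothetical one, not part of the library.

```python
import math

def armijo_step(f, x, g, d, alpha=1.0, rho=0.5, c=1e-4):
    """Backtracking line search: shrink alpha until the Armijo
    sufficient-decrease condition holds along direction d."""
    fx = f(x)
    slope = sum(gi * di for gi, di in zip(g, d))
    while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + c * alpha * slope:
        alpha *= rho
    return alpha

def fletcher_reeves(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear conjugate gradient with the Fletcher-Reeves beta
    and an inexact (backtracking) line search."""
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]
    for _ in range(max_iter):
        gg = sum(gi * gi for gi in g)
        if math.sqrt(gg) < tol:
            break
        alpha = armijo_step(f, x, g, d)
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        beta = sum(gi * gi for gi in g_new) / gg   # Fletcher-Reeves ratio
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        # safeguard: restart with steepest descent if d is not a descent direction
        if sum(gi * di for gi, di in zip(g_new, d)) >= 0:
            d = [-gi for gi in g_new]
        g = g_new
    return x

# Hypothetical smooth objective with minimum at (1, -2):
f = lambda v: (v[0] - 1.0) ** 2 + 10.0 * (v[1] + 2.0) ** 2
grad = lambda v: [2.0 * (v[0] - 1.0), 20.0 * (v[1] + 2.0)]
xmin = fletcher_reeves(f, grad, [0.0, 0.0])
```

Because the objective need not be quadratic, the conjugacy of the search directions degrades over the iterations, which is why practical nonlinear CG codes include a restart safeguard like the one above.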

Sun Feb 25 12:08:00 MST 1996