Gradient method

In optimization, a gradient method is an algorithm to solve problems of the form

\min_{x\in\mathbb R^n}\; f(x)

with the search directions defined by the gradient of the function at the current point. Examples of gradient methods are gradient descent and the conjugate gradient method.
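As a concrete illustration, the following is a minimal sketch of one such method, gradient descent with a fixed step size, assuming the objective f is differentiable and its gradient is available as a function (the names `gradient_descent`, `grad`, and the step-size value are illustrative choices, not part of the original text):

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=10_000):
    """Minimize f over R^n by repeatedly stepping opposite its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)                 # search direction is -g
        if np.linalg.norm(g) < tol:  # stop near a stationary point
            break
        x = x - step * g
    return x

# Example: f(x) = ||x - c||^2 has gradient 2(x - c) and minimizer x = c.
c = np.array([1.0, -2.0])
x_min = gradient_descent(lambda x: 2 * (x - c), x0=[0.0, 0.0])
```

A fixed step size is the simplest choice; practical gradient methods typically pick the step by a line search along the search direction instead.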
