Truncated Newton method
Truncated Newton methods, also known as Hessian-free optimization,[1] are a family of optimization algorithms designed for optimizing non-linear functions with large numbers of variables. A truncated Newton method consists of repeated application of an iterative optimization algorithm to approximately solve Newton's equations, which determine an update to the function's parameters. The inner solver is truncated, i.e., run for only a limited number of iterations. It follows that, for truncated Newton methods to work, the inner solver needs to produce a good approximation in a small number of iterations;[2] conjugate gradient has been suggested and evaluated as a candidate inner loop.[1] Another prerequisite is good preconditioning for the inner algorithm.[3]
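The following is a minimal sketch of this scheme, not a reference implementation: at each outer step the Newton equations ∇²f(x) p = −∇f(x) are solved only approximately by a truncated conjugate gradient loop that uses Hessian-vector products, and the resulting direction is applied with a backtracking line search. The function and helper names (`truncated_newton`, `cg_solve`, `fd_hvp`) and the Rosenbrock test problem are illustrative assumptions, not taken from the sources cited above; no preconditioning is shown.

```python
import numpy as np

def cg_solve(hvp, g, max_iter=10, tol=1e-8):
    """Approximately solve H p = -g by conjugate gradient, truncated after
    max_iter iterations. hvp(v) must return the Hessian-vector product H v."""
    p = np.zeros_like(g)
    r = -g.copy()                      # residual of H p = -g at p = 0
    d = r.copy()
    rs_old = r @ r
    for _ in range(max_iter):
        Hd = hvp(d)
        dHd = d @ Hd
        if dHd <= 0:                   # negative curvature: stop with current iterate
            break
        alpha = rs_old / dHd
        p += alpha * d
        r -= alpha * Hd
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs_old) * d
        rs_old = rs_new
    return p

def truncated_newton(f, grad, hvp, x0, outer_iter=100, inner_iter=10):
    """Each outer iteration approximately solves H(x) p = -grad(x) with the
    truncated CG loop above, then takes a damped step along p."""
    x = x0.astype(float)
    for _ in range(outer_iter):
        g = grad(x)
        if np.linalg.norm(g) < 1e-6:
            break
        p = cg_solve(lambda v: hvp(x, v), g, max_iter=inner_iter)
        if p @ g >= 0:                 # safeguard: fall back to steepest descent
            p = -g
        t = 1.0
        for _ in range(30):            # simple backtracking (Armijo) line search
            if f(x + t * p) <= f(x) + 1e-4 * t * (g @ p):
                break
            t *= 0.5
        x = x + t * p
    return x

# Illustrative test on the Rosenbrock function; the Hessian-vector product is
# approximated by a finite difference of the gradient, so second derivatives
# never need to be formed explicitly ("Hessian-free").
def rosenbrock(x):
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1 - x[:-1]) ** 2)

def rosenbrock_grad(x):
    g = np.zeros_like(x)
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1] ** 2) - 2.0 * (1 - x[:-1])
    g[1:] += 200.0 * (x[1:] - x[:-1] ** 2)
    return g

def fd_hvp(x, v, eps=1e-6):
    return (rosenbrock_grad(x + eps * v) - rosenbrock_grad(x)) / eps

x_star = truncated_newton(rosenbrock, rosenbrock_grad, fd_hvp, np.full(10, -1.0))
print(x_star)  # should be close to the all-ones vector, the global minimizer
```

Because the Hessian enters only through products H v, the inner loop needs neither the Hessian matrix nor its factorization, which is what makes the approach attractive when the number of variables is large.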
References
Further reading
- Grippo, L.; Lampariello, F.; Lucidi, S. (1989). "A Truncated Newton Method with Nonmonotone Line Search for Unconstrained Optimization". Journal of Optimization Theory and Applications 60 (3). CiteSeerX: 10.1.1.455.7495.
- Nash, Stephen G.; Nocedal, Jorge (1991). "A numerical study of the limited memory BFGS method and the truncated-Newton method for large scale optimization". SIAM Journal on Optimization 1 (3): 358–372. CiteSeerX: 10.1.1.474.3400.