CONMIN | Authors | David F. Shanno, RUTCOR, Rutgers University, New Brunswick, New Jersey, USA, E-mail: shanno@farkas.rutgers.edu; P.K.H. Phua, Department of Information Systems and Computer Science, National University of Singapore, Kent Ridge, Singapore 1026, Singapore |
Language | FORTRAN | |
Algorithm | Limited memory conjugate gradient | |
Input Format | ||
Modeling Languages link | ||
Commercial Status | free (ask the authors) | |
Platform | Any machine with a reasonable amount of memory and a Fortran compiler | |
Remarks | CONMIN is a well-known subroutine implementing a two-step limited-memory quasi-Newton-like conjugate gradient method with Beale restarts.
Only seven vectors of storage are required. Step sizes are obtained by Davidon's cubic interpolation method so as to satisfy the Wolfe conditions. CONMIN uses two pairs of vectors to build the current approximation of the Hessian matrix. Extensive numerical comparisons between CONMIN and other limited-memory methods are given in: Dong C. Liu and Jorge Nocedal, On the Limited Memory BFGS Method for Large Scale Optimization. Mathematical Programming, vol. 45, 1989, pp. 503-528. See also the references below. |
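The general scheme described above (conjugate gradient directions, restarts, and a line search on the step size) can be illustrated with a minimal sketch. This is not CONMIN's code (CONMIN is Fortran, and uses Beale restarts with Davidon's cubic interpolation satisfying the Wolfe conditions); here those components are replaced by simpler stand-ins, a periodic restart and an Armijo backtracking search, applied to a small convex quadratic chosen for illustration:

```python
# Illustrative sketch only (not CONMIN): nonlinear conjugate gradient
# with periodic restarts and Armijo backtracking, minimizing a small
# convex quadratic f(x) = sum_i (i+1) * x_i^2 (minimum at the origin).

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def f(x):
    return sum((i + 1) * xi * xi for i, xi in enumerate(x))

def grad(x):
    return [2 * (i + 1) * xi for i, xi in enumerate(x)]

def cg_minimize(f, grad, x, max_iter=200, tol=1e-10):
    g = grad(x)
    d = [-gi for gi in g]                      # start with steepest descent
    for k in range(max_iter):
        slope = dot(g, d)
        if slope >= 0:                         # safeguard: restart on non-descent d
            d = [-gi for gi in g]
            slope = dot(g, d)
        # Armijo backtracking: a simplified stand-in for CONMIN's
        # Davidon cubic interpolation / Wolfe-condition line search
        alpha, fx = 1.0, f(x)
        while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        if dot(g_new, g_new) ** 0.5 < tol:
            break
        # Polak-Ribiere beta, reset to zero every n steps: a crude
        # stand-in for Beale's restart criterion
        beta = max(0.0, (dot(g_new, g_new) - dot(g_new, g)) / dot(g, g))
        if (k + 1) % len(x) == 0:
            beta = 0.0
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

x_min = cg_minimize(f, grad, [1.0, 1.0, 1.0, 1.0])
```

Note that the sketch stores only a handful of vectors (`x`, `g`, `g_new`, `d`), which is the spirit of the limited-memory design: memory grows linearly in the problem dimension rather than quadratically as with a full quasi-Newton matrix.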
References | Shanno, D.F., Phua, K.H., Algorithm 500: Minimization of Unconstrained Multivariable Functions [E4]. ACM TOMS, vol. 2, no. 1, March 1976, pp. 87-94.
Shanno, D.F., Conjugate Gradient Methods with Inexact Searches. Mathematics of Operations Research, vol. 3, no. 3, August 1978, pp. 244-256.
X. Zou, I.M. Navon, M. Berger, K.H. Phua, T. Schlick, F.X. Le Dimet, Numerical Experience with Limited-Memory Quasi-Newton and Truncated Newton Methods. SIAM Journal on Optimization, vol. 3, no. 3, August 1993, pp. 582-608.
I.M. Navon, P.K.H. Phua, M. Ramamurthy, Vectorization of Conjugate-Gradient Methods for Large-Scale Minimization in Meteorology. JOTA, vol. 66, no. 1, July 1990, pp. 71-93. |