Unconstrained Optimization Packages

Package TN
Author Stephen G. Nash
ORAS Department
George Mason University
Fairfax, VA 22030
Phone: (703) 993-1678
E-mail: snash@gmu.edu
Language FORTRAN
Algorithm Truncated Newton & Conjugate Gradient
Input Format
Modeling Languages
Commercial Status free
Platform Any machine with a reasonable amount of memory and a Fortran compiler
Remarks TN uses a truncated-Newton method based on a line search. Truncated-Newton methods compute an approximation to the Newton direction by approximately solving the Newton equations using an iterative method. In this software, the conjugate gradient method is used as the iterative solver.
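The inner-outer structure described in the remarks can be sketched as follows. This is a minimal illustration of the truncated-Newton idea, not the TN package itself: the function names, the particular CG truncation tests, and the backtracking (Armijo) line search are assumptions chosen for the sketch, and the Rosenbrock function is used only as a standard test problem.

```python
# Sketch of a truncated-Newton method: the Newton system H d = -g is solved
# only approximately by conjugate gradients (truncated early), and the
# resulting direction is used in a backtracking line search.
import numpy as np

def truncated_cg(hessvec, g, max_iter=20, tol=0.1):
    """Approximately solve H d = -g by CG, stopping early (truncation)."""
    d = np.zeros_like(g)
    r = -g.copy()                     # residual of H d = -g at d = 0
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Hp = hessvec(p)
        pHp = p @ Hp
        if pHp <= 1e-12 * (p @ p):    # non-positive curvature: stop here
            break
        alpha = rs / pHp
        d += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * np.linalg.norm(g):  # inexact stopping test
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    if not d.any():                   # CG made no progress: steepest descent
        d = -g
    return d

def truncated_newton(f, grad, hessvec, x0, max_outer=100, gtol=1e-6):
    x = x0.astype(float)
    for _ in range(max_outer):
        g = grad(x)
        if np.linalg.norm(g) < gtol:
            break
        d = truncated_cg(lambda p: hessvec(x, p), g)
        t, fx = 1.0, f(x)             # backtracking (Armijo) line search
        while f(x + t * d) > fx + 1e-4 * t * (g @ d):
            t *= 0.5
            if t < 1e-12:
                break
        x = x + t * d
    return x

# Standard Rosenbrock test problem, minimum at (1, 1).
def rosen(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosen_grad(x):
    return np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                     200.0 * (x[1] - x[0]**2)])

def rosen_hessvec(x, p):
    H = np.array([[1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
                  [-400.0 * x[0], 200.0]])
    return H @ p
```

Note that only Hessian-vector products are required, never the full Hessian; this is what makes truncated-Newton methods attractive for large-scale problems.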
Extensive numerical comparisons between TN and other limited-memory methods are given in:
  • Stephen G. Nash and Jorge Nocedal, A numerical study of the limited memory BFGS method and the truncated Newton method for large scale optimization, SIAM Journal on Optimization, vol. 1, no. 3, August 1991, pp. 358-372.
  • Neculai Andrei, Advanced Mathematical Programming - Theory, Computational Methods, Applications, Technical Publishing House, 1999, Chapter 14 (in Romanian).
References

    S. G. Nash, Newton-like minimization via the Lanczos method, SIAM J. Numer. Anal. 21 (1984), pp. 770--788.
    S. G. Nash, User's guide for TN/TNBC, Technical Report 397, Department of Mathematical Sciences, The Johns Hopkins University, Baltimore, 1984.
    X. Zou, I. M. Navon, M. Berger, K. H. Phua, T. Schlick, F. X. Le Dimet, Numerical experience with limited memory quasi-Newton and truncated Newton methods, SIAM Journal on Optimization, vol. 3, no. 3, August 1993, pp. 582-608.

    S. G. Nash, A survey of truncated-Newton methods, Journal of Computational and Applied Mathematics, 124 (2000), pp. 45-59.