Global Convergence and Implementation of NGTN Method for Solving Large-Scale Sparse Nonlinear Programming Problems

Year:    2001

Journal of Computational Mathematics, Vol. 19 (2001), Iss. 4 : pp. 337–346

Abstract

An NGTN method was proposed for solving large-scale sparse nonlinear programming (NLP) problems. It is a hybrid method combining a truncated Newton direction with a modified negative gradient direction; it is well suited to sparse data structures and possesses a Q-quadratic convergence rate. In this paper, the global convergence of the new method is proved, its convergence rate is further analysed, and the detailed implementation is discussed. Numerical tests on truss optimization and other large sparse problems are reported. The theoretical and numerical results show that the new method is efficient for solving large-scale sparse NLP problems.
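
The abstract only outlines the method; as a rough illustration of the general hybrid idea it describes (a truncated Newton direction computed by conjugate gradients, with a negative gradient direction as a fallback), here is a minimal unconstrained sketch in Python. It is not the authors' NGTN algorithm: the function names, tolerances, and the Armijo line search are illustrative assumptions, a plain negative-gradient step stands in for the paper's modified negative gradient direction, and the actual method treats constrained, sparse NLP problems.

```python
import numpy as np

def truncated_newton_step(grad, hess_vec, cg_tol=1e-8, cg_max_iter=50):
    """Approximately solve H d = -g by conjugate gradients, truncating early.
    Returns None if negative curvature is detected before a usable step exists."""
    d = np.zeros_like(grad)
    r = -grad.copy()            # residual of H d = -g at d = 0
    p = r.copy()
    rs_old = float(r @ r)
    for _ in range(cg_max_iter):
        Hp = hess_vec(p)
        curv = float(p @ Hp)
        if curv <= 0.0:         # Newton model is not locally convex here
            return d if float(d @ grad) < 0.0 else None
        alpha = rs_old / curv
        d = d + alpha * p
        r = r - alpha * Hp
        rs_new = float(r @ r)
        if np.sqrt(rs_new) < cg_tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return d

def hybrid_minimize(f, grad_f, hess_vec_f, x0, tol=1e-6, max_outer=200):
    """Hybrid iteration: prefer the truncated Newton direction, but fall back
    to the negative gradient when the Newton step is unusable or is not a
    sufficient descent direction. Step lengths come from Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_outer):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        d = truncated_newton_step(g, lambda v: hess_vec_f(x, v))
        if d is None or float(g @ d) >= -1e-10 * np.linalg.norm(g) * np.linalg.norm(d):
            d = -g              # fallback: negative gradient direction
        fx, gd, t = f(x), float(g @ d), 1.0
        while f(x + t * d) > fx + 1e-4 * t * gd and t > 1e-12:
            t *= 0.5            # Armijo backtracking
        x = x + t * d
    return x

# Usage: a small convex quadratic whose Hessian is accessed only through
# matrix-vector products, as one would do with sparse problem data.
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 2.0, 3.0])
x_star = hybrid_minimize(
    f=lambda x: 0.5 * x @ A @ x - b @ x,
    grad_f=lambda x: A @ x - b,
    hess_vec_f=lambda x, v: A @ v,
    x0=np.zeros(3),
)
print(x_star)   # approximately [1.0, 0.2, 0.03]
```

Accessing the Hessian only through matrix-vector products is what makes a truncated Newton inner solver attractive for large sparse problems: the sparsity pattern never needs to be factorized, only multiplied.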

Journal Article Details

Publisher Name:    Global Science Press

Language:    English

DOI:    https://doi.org/2001-JCM-8986

Published online:    2001-01

Copyright:    © Global Science Press

Pages:    10

Keywords:    Nonlinear programming, Large-scale problem, Sparse.