Global Convergence and Implementation of NGTN Method for Solving Large-Scale Sparse Nonlinear Programming Problems
Abstract
An NGTN method is proposed for solving large-scale sparse nonlinear programming (NLP) problems. It is a hybrid of a truncated Newton direction and a modified negative gradient direction, is well suited to exploiting sparse data structures, and attains a Q-quadratic convergence rate. In this paper, the global convergence of the new method is proved, the convergence rate is further analysed, and the implementation is discussed in detail. Numerical tests on truss optimization and other large sparse problems are reported. The theoretical and numerical results show that the new method is efficient for solving large-scale sparse NLP problems.
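The abstract describes a hybrid of a truncated Newton direction and a (modified) negative gradient direction. Below is a minimal, illustrative sketch of such a hybrid scheme, not the paper's exact NGTN algorithm: the truncated Newton direction is computed by conjugate gradients on Hessian-vector products (truncated on negative curvature or once a residual tolerance is met), with a fallback to the plain negative gradient when the CG direction is not a usable descent direction. All function names, the forcing-term choice, and the Armijo line search are assumptions for the sketch.

```python
import numpy as np

def truncated_newton_direction(grad, hess_vec, tol, max_iter=50):
    """Approximately solve H d = -grad by CG, truncating on
    (near-)negative curvature or when the residual drops below tol."""
    d = np.zeros_like(grad)
    r = -grad.copy()          # residual of H d = -g at d = 0
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Hp = hess_vec(p)
        pHp = p @ Hp
        if pHp <= 1e-12 * (p @ p):   # negative/zero curvature: truncate
            break
        alpha = rs / pHp
        d += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:    # inner tolerance reached
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return d

def hybrid_minimize(f, grad_f, hess_vec_f, x0, tol=1e-8, max_iter=200):
    """Hybrid step: try a truncated Newton direction; fall back to the
    negative gradient if it is unusable. Armijo backtracking line search."""
    x = x0.copy()
    for _ in range(max_iter):
        g = grad_f(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        # Forcing term ~ min(0.5, sqrt(|g|)) |g| gives superlinear behaviour.
        d = truncated_newton_direction(
            g, lambda v: hess_vec_f(x, v),
            tol=min(0.5, np.sqrt(gnorm)) * gnorm)
        if np.allclose(d, 0) or d @ g >= 0:
            d = -g                   # fall back to negative gradient
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5                 # Armijo backtracking
        x = x + t * d
    return x
```

Because only Hessian-vector products are needed, the inner CG loop never forms the Hessian explicitly, which is what makes this style of method attractive for large sparse problems.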
About this article
How to Cite
Global Convergence and Implementation of NGTN Method for Solving Large-Scale Sparse Nonlinear Programming Problems. (2001). Journal of Computational Mathematics, 19(4), 337-346. https://global-sci.com/index.php/JCM/article/view/11436