A Note on the Nonlinear Conjugate Gradient Method
Abstract
The nonlinear conjugate gradient method for unconstrained optimization is parameterized by a scalar. In this note, a general condition on this scalar is given which ensures the global convergence of the method under strong Wolfe line searches. It is also discussed how to use this result to obtain the convergence of the well-known Fletcher-Reeves and Polak-Ribière-Polyak conjugate gradient methods. It is further noted that, in a certain sense, the condition cannot be relaxed.
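To illustrate the role of the scalar discussed in the abstract, the following sketch implements the generic nonlinear CG direction update d_{k+1} = -g_{k+1} + β_k d_k with the Fletcher-Reeves and Polak-Ribière-Polyak choices of β_k. The line search here is a simple backtracking Armijo search with a steepest-descent restart safeguard, a simplification of the strong Wolfe line search the note actually assumes; all function names and the quadratic test problem are illustrative, not from the paper.

```python
import numpy as np

def fr_beta(g_new, g_old):
    # Fletcher-Reeves: beta_k = ||g_{k+1}||^2 / ||g_k||^2
    return (g_new @ g_new) / (g_old @ g_old)

def prp_beta(g_new, g_old):
    # Polak-Ribiere-Polyak: beta_k = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
    return g_new @ (g_new - g_old) / (g_old @ g_old)

def nonlinear_cg(f, grad, x0, beta_rule, iters=200, tol=1e-8):
    """Nonlinear CG with backtracking Armijo line search
    (a simplification of the strong Wolfe search assumed in the note)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:
            # safeguard: restart with steepest descent if d is not a descent direction
            d = -g
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # the scalar beta_k distinguishes the different CG variants
        d = -g_new + beta_rule(g_new, g) * d
        x, g = x_new, g_new
    return x

# usage: minimize a convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = nonlinear_cg(f, grad, np.zeros(2), fr_beta)
```

Swapping `fr_beta` for `prp_beta` changes only the scalar, which is exactly the degree of freedom the note's convergence condition constrains.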
How to Cite
A Note on the Nonlinear Conjugate Gradient Method. (2021). Journal of Computational Mathematics, 20(6), 575-582. https://global-sci.com/index.php/JCM/article/view/11516