A Note on the Nonlinear Conjugate Gradient Method

Authors

  • Yu-Hong Dai
  • Ya-Xiang Yuan

Keywords

Unconstrained optimization, Conjugate gradient, Line search, Global convergence.

Abstract

The conjugate gradient method for unconstrained optimization problems varies with a scalar parameter. In this note, a general condition on this scalar is given that ensures the global convergence of the method under strong Wolfe line searches. It is also shown how this result yields the convergence of the well-known Fletcher-Reeves and Polak-Ribière-Polyak conjugate gradient methods, and it is noted that the condition cannot be relaxed in a certain sense.
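The family of methods the abstract refers to can be sketched as follows. This is a minimal illustration, not the authors' algorithm: it assumes the standard nonlinear CG iteration x_{k+1} = x_k + α_k d_k, d_{k+1} = -g_{k+1} + β_k d_k, with β_k chosen by the Fletcher-Reeves or Polak-Ribière-Polyak formula, and it uses SciPy's strong-Wolfe line search; the restart fallback and the truncation of the PRP scalar at zero are practical choices added here, not taken from the paper.

```python
import numpy as np
from scipy.optimize import line_search  # strong Wolfe line search

def nonlinear_cg(f, grad, x0, beta_rule="FR", tol=1e-8, max_iter=200):
    """Nonlinear CG sketch with a strong Wolfe line search.

    beta_rule: "FR" (Fletcher-Reeves) or "PRP" (Polak-Ribiere-Polyak,
    truncated at zero -- a common safeguard, an assumption here).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Step length satisfying the strong Wolfe conditions
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                 # search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        if beta_rule == "FR":             # beta = ||g_{k+1}||^2 / ||g_k||^2
            beta = (g_new @ g_new) / (g @ g)
        else:                             # beta = g_{k+1}^T (g_{k+1}-g_k) / ||g_k||^2
            beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d             # new conjugate direction
        x, g = x_new, g_new
    return x
```

For instance, on a strictly convex quadratic f(x) = x^T A x / 2 (here A is an illustrative diagonal matrix), both choices of β_k drive the gradient norm below the tolerance within a few iterations.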

Published

2021-07-01

Section

Articles

How to Cite

A Note on the Nonlinear Conjugate Gradient Method. (2021). Journal of Computational Mathematics, 20(6), 575-582. https://global-sci.com/index.php/JCM/article/view/11516