A Note on the Nonlinear Conjugate Gradient Method

Year:    2002

Author:    Yu-Hong Dai, Ya-Xiang Yuan

Journal of Computational Mathematics, Vol. 20 (2002), Iss. 6 : pp. 575–582

Abstract

The nonlinear conjugate gradient method for unconstrained optimization depends on a scalar parameter in its search-direction update. In this note, a general condition on this scalar is given that ensures the global convergence of the method under strong Wolfe line searches. It is also discussed how this result can be used to obtain the convergence of the well-known Fletcher-Reeves and Polak-Ribière-Polyak conjugate gradient methods. It is further noted that, in a certain sense, the condition cannot be relaxed.
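To illustrate the role of the scalar the abstract refers to, the following is a minimal sketch of a nonlinear conjugate gradient iteration using the Fletcher-Reeves choice β_k = ‖g_{k+1}‖² / ‖g_k‖². The test function, the Armijo backtracking search (a simplification of the strong Wolfe search analyzed in the paper), and the descent-direction restart safeguard are all illustrative assumptions, not the paper's construction.

```python
# Sketch of nonlinear conjugate gradient with the Fletcher-Reeves scalar.
# Illustrative convex quadratic f(x, y) = 0.5*(3x^2 + y^2) + x*y, whose
# minimizer is (0, 0); Armijo backtracking stands in for a strong Wolfe
# line search (assumption for brevity).

def f(x):
    return 0.5 * (3.0 * x[0] ** 2 + x[1] ** 2) + x[0] * x[1]

def grad(x):
    return [3.0 * x[0] + x[1], x[0] + x[1]]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def cg_fletcher_reeves(x0, tol=1e-8, max_iter=500):
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                     # start with steepest descent
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 < tol:
            break
        # Armijo backtracking: shrink alpha until sufficient decrease holds
        alpha, c1 = 1.0, 1e-4
        while (alpha > 1e-14 and
               f([xi + alpha * di for xi, di in zip(x, d)])
               > f(x) + c1 * alpha * dot(g, d)):
            alpha *= 0.5
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # Fletcher-Reeves scalar: beta = ||g_{k+1}||^2 / ||g_k||^2
        beta = dot(g_new, g_new) / dot(g, g)
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        # Safeguard: restart with steepest descent if d is not a descent
        # direction (the paper's condition is what makes such failures
        # controllable under strong Wolfe searches)
        if dot(g_new, d) >= 0.0:
            d = [-gi for gi in g_new]
        g = g_new
    return x

x_star = cg_fletcher_reeves([2.0, -1.0])
```

Swapping the `beta` line for the Polak-Ribière-Polyak formula β_k = g_{k+1}ᵀ(g_{k+1} − g_k) / ‖g_k‖² gives the other classical variant discussed in the note.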

Journal Article Details

Publisher Name:    Global Science Press

Language:    English

DOI:    https://doi.org/2002-JCM-8942

Journal of Computational Mathematics, Vol. 20 (2002), Iss. 6 : pp. 575–582

Published online:    2002-01

Copyright:    © Global Science Press

Pages:    8

Keywords:    Unconstrained optimization, Conjugate gradient, Line search, Global convergence.

Author Details

Yu-Hong Dai

Ya-Xiang Yuan