A New Stepsize for the Steepest Descent Method

Year:    2006

Journal of Computational Mathematics, Vol. 24 (2006), Iss. 2 : pp. 149–156

Abstract

The steepest descent method is the simplest gradient method for optimization. It is well known that exact line searches along each steepest descent direction may converge very slowly. An important result was given by Barzilai and Borwein, whose stepsize is proved to yield superlinear convergence for convex quadratics in two-dimensional space and performs quite well for high-dimensional problems. The BB method is not monotone, however, so it is not easily generalized to general nonlinear functions unless certain nonmonotone techniques are applied. It is therefore very desirable to find stepsize formulae that enable fast convergence and possess the monotone property. Such a stepsize $\alpha_k$ for the steepest descent method is suggested in this paper. An algorithm that uses this new stepsize in even iterations and exact line search in odd iterations is proposed. Numerical results are presented, which confirm that the new method finds the exact solution within three iterations for two-dimensional problems. The new method is very efficient for small-scale problems. A modified version of the new method is also presented, in which the new technique for selecting the stepsize is applied after every two exact line searches. The modified algorithm is comparable to the Barzilai-Borwein method for large-scale problems and better for small-scale problems.
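The abstract does not give the paper's new stepsize formula, so the sketch below only illustrates the two standard ingredients it refers to: steepest descent on a convex quadratic $f(x)=\tfrac{1}{2}x^TAx-b^Tx$ with either an exact line search ($\alpha_k = g_k^Tg_k / g_k^TAg_k$) or the Barzilai-Borwein stepsize ($\alpha_k = s_{k-1}^Ts_{k-1} / s_{k-1}^Ty_{k-1}$). The function name and interface are illustrative assumptions, not code from the paper.

```python
import numpy as np

def steepest_descent_quadratic(A, b, x0, stepsize="exact", tol=1e-10, max_iter=1000):
    """Steepest descent for f(x) = 0.5 x^T A x - b^T x, A symmetric positive definite.

    stepsize="exact": alpha_k = g_k^T g_k / (g_k^T A g_k)              (exact line search)
    stepsize="bb":    alpha_k = s_{k-1}^T s_{k-1} / (s_{k-1}^T y_{k-1}) (Barzilai-Borwein)
    """
    x = x0.astype(float)
    g = A @ x - b                       # gradient of the quadratic
    s_prev = y_prev = None
    k = 0
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if stepsize == "bb" and s_prev is not None:
            alpha = (s_prev @ s_prev) / (s_prev @ y_prev)   # BB stepsize from previous step
        else:
            alpha = (g @ g) / (g @ (A @ g))                 # exact minimizer along -g
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s_prev, y_prev = x_new - x, g_new - g               # differences used by the BB formula
        x, g = x_new, g_new
    return x, k
```

The BB variant is typically nonmonotone in $f$, which is exactly the property the paper seeks to avoid while retaining fast convergence.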

Journal Article Details

Publisher Name:    Global Science Press

Language:    English

DOI:    https://doi.org/2006-JCM-8741

Published online:    2006-01

AMS Subject Headings:   

Copyright:    © Global Science Press

Pages:    8

Keywords:    Steepest descent, Line search, Unconstrained optimization, Convergence.