Two Novel Gradient Methods with Optimal Step Sizes

Year:    2021

Author:    Harry Oviedo, Oscar Dalmau, Rafael Herrera

Journal of Computational Mathematics, Vol. 39 (2021), Iss. 3 : pp. 375–391

Abstract

In this work we introduce two new Barzilai-Borwein-like step sizes for the classical gradient method applied to strictly convex quadratic optimization problems. The proposed step sizes employ second-order information to obtain faster gradient-type methods. Both step sizes are derived from two unconstrained optimization models that involve approximate information about the Hessian of the objective function. A convergence analysis of the proposed algorithm is provided. Numerical experiments compare the efficiency and effectiveness of the proposed methods with similar methods from the literature. Experimentally, our proposals are observed to accelerate the gradient method at almost no extra computational cost, which makes them a good alternative for solving large-scale problems.
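
Since the full text is not openly accessible, the two novel step sizes themselves are not reproduced here. As background for the abstract, the sketch below shows the classical gradient method with the standard Barzilai-Borwein (BB1) step size on a strictly convex quadratic f(x) = 0.5*x'Ax - b'x, i.e. the baseline family of methods the paper builds on. The function name and parameters are illustrative assumptions, not code from the paper.

```python
import numpy as np

def bb_gradient(A, b, x0, max_iter=500, tol=1e-8):
    """Gradient method with the classical Barzilai-Borwein (BB1) step size
    for the strictly convex quadratic f(x) = 0.5*x'Ax - b'x, with A
    symmetric positive definite. This is the standard BB method, not the
    two novel step sizes proposed in the paper."""
    x = x0.astype(float)
    g = A @ x - b                                # gradient of the quadratic
    alpha = 1.0 / max(np.linalg.norm(g), 1e-16)  # conservative first step
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s = x_new - x                            # iterate difference s_k
        y = g_new - g                            # gradient difference y_k
        alpha = (s @ s) / (s @ y)                # BB1 step: s's / s'y
        x, g = x_new, g_new
    return x

# Usage on a small random symmetric positive definite system.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)                    # well-conditioned SPD matrix
b = rng.standard_normal(50)
x = bb_gradient(A, b, np.zeros(50))
print(np.linalg.norm(A @ x - b))                 # residual near the tolerance
```

For symmetric positive definite A, s'y = s'As > 0, so the BB1 step is always well defined; the step sizes proposed in the paper replace this formula with alternatives derived from approximate Hessian information.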


Journal Article Details

Publisher Name:    Global Science Press

Language:    English

DOI:    https://doi.org/10.4208/jcm.2001-m2018-0205


Published online:    2021-01


Copyright:    © Global Science Press

Pages:    17

Keywords:    Gradient methods, Convex quadratic optimization, Hessian spectral properties, Steplength selection.

Author Details

Harry Oviedo

Oscar Dalmau

Rafael Herrera

Cited By

  1. A second-order gradient method for convex minimization

    Oviedo, Harry

    Boletín de la Sociedad Matemática Mexicana, Vol. 27 (2021), Iss. 3

    https://doi.org/10.1007/s40590-021-00375-7 [Citations: 0]
  2. An accelerated minimal gradient method with momentum for strictly convex quadratic optimization

    Oviedo, Harry | Dalmau, Oscar | Herrera, Rafael

    BIT Numerical Mathematics, Vol. 62 (2022), Iss. 2 P.591

    https://doi.org/10.1007/s10543-021-00886-9 [Citations: 2]
  3. A Dynamical Systems Approach to Machine Learning

    Pourmohammad Azizi, S. | Neisy, Abdolsadeh | Ahmad Waloo, Sajad

    International Journal of Computational Methods, Vol. 20 (2023), Iss. 09

    https://doi.org/10.1142/S021987622350007X [Citations: 1]
  4. A collection of efficient retractions for the symplectic Stiefel manifold

    Oviedo, H. | Herrera, R.

    Computational and Applied Mathematics, Vol. 42 (2023), Iss. 4

    https://doi.org/10.1007/s40314-023-02302-0 [Citations: 1]
  5. Delayed Weighted Gradient Method with simultaneous step-sizes for strongly convex optimization

    Lara, Hugo | Aleixo, Rafael | Oviedo, Harry

    Computational Optimization and Applications, Vol. 89 (2024), Iss. 1 P.151

    https://doi.org/10.1007/s10589-024-00586-4 [Citations: 0]
  6. A cyclic delayed weighted steplength for the gradient method

    Oviedo, Harry

    Ricerche di Matematica, Vol. 73 (2024), Iss. 2 P.873

    https://doi.org/10.1007/s11587-021-00646-5 [Citations: 1]