An Adaptive Gradient Method with Energy and Momentum

Year:    2022

Authors:    Hailiang Liu, Xuping Tian

Annals of Applied Mathematics, Vol. 38 (2022), Iss. 2 : pp. 183–222

Abstract

We introduce a novel algorithm for gradient-based optimization of stochastic objective functions. The method may be seen as a variant of SGD with momentum equipped with an adaptive learning rate that is automatically adjusted by an ‘energy’ variable. The method is simple to implement, computationally efficient, and well suited for large-scale machine learning problems, and it exhibits unconditional energy stability for any size of the base learning rate. We provide a regret bound on the convergence rate under the online convex optimization framework, and we establish an energy-dependent convergence rate of the algorithm to a stationary point in the stochastic non-convex setting. In addition, a sufficient condition is provided to guarantee a positive lower threshold for the energy variable. Our experiments demonstrate that the algorithm converges fast while generalizing better than or as well as SGD with momentum in training deep neural networks, and that it also compares favorably to Adam.
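For illustration, here is a minimal Python sketch of an energy-adaptive momentum update in the spirit of the abstract: a heavy-ball momentum buffer whose effective step size is rescaled by a non-increasing ‘energy’ variable, which keeps the iteration stable for any base learning rate. The specific update formulas are modeled on the authors' earlier AEGD scheme and are an assumption, not the algorithm of this paper; the name agem_sketch and the hyperparameters eta, beta, and c are likewise placeholders.

```python
import numpy as np

def agem_sketch(f, grad_f, x0, eta=0.1, beta=0.9, c=1.0, steps=1000):
    """Hypothetical energy-adaptive momentum update (not the paper's exact
    algorithm): SGD with momentum whose step size is modulated by an
    auxiliary 'energy' variable r, following the authors' earlier AEGD idea."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)                       # momentum buffer
    r = np.sqrt(f(x) + c) * np.ones_like(x)    # per-coordinate energy variable
    for _ in range(steps):
        g = grad_f(x)
        m = beta * m + (1.0 - beta) * g        # heavy-ball style momentum
        v = m / (2.0 * np.sqrt(f(x) + c))      # momentum-averaged gradient of sqrt(f + c)
        r = r / (1.0 + 2.0 * eta * v * v)      # r is non-increasing for any eta > 0,
                                               # mirroring unconditional energy stability
        x = x - 2.0 * eta * r * v              # energy-scaled descent step
    return x

# Usage on a toy quadratic with minimum at (1, -2)
f = lambda x: 0.5 * ((x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2)
grad_f = lambda x: np.array([x[0] - 1.0, x[1] + 2.0])
print(agem_sketch(f, grad_f, x0=[5.0, 5.0]))
```

On the toy quadratic the iterate approaches (1, -2); the monotone decay of r under any choice of eta is what the abstract refers to as unconditional energy stability.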

Journal Article Details

Publisher Name:    Global Science Press

Language:    English

DOI:    https://doi.org/10.4208/aam.OA-2021-0095

Published online:    2022-01

Copyright:    © Global Science Press

Pages:    40

Keywords:    Stochastic optimization, SGD, energy stability, momentum.

Author Details

Hailiang Liu

Xuping Tian

Cited By

  1. An Improved Medical Image Classification Algorithm Based on Adam Optimizer

    Sun, Haijing | Zhou, Wen | Yang, Jiapeng | Shao, Yichuan | Xing, Lei | Zhao, Qian | Zhang, Le

    Mathematics, Vol. 12 (2024), Iss. 16 P.2509

    https://doi.org/10.3390/math12162509 [Citations: 0]
  2. Swarm-Based Optimization with Random Descent

    Tadmor, Eitan | Zenginoğlu, Anil

    Acta Applicandae Mathematicae, Vol. 190 (2024), Iss. 1

    https://doi.org/10.1007/s10440-024-00639-0 [Citations: 0]
  3. Point Cloud Patching from Intraoral 3D Scanning Based on Dental-Net

    Li, Jie

    Software Engineering and Applications, Vol. 12 (2023), Iss. 01 P.68

    https://doi.org/10.12677/SEA.2023.121008 [Citations: 0]
  4. Swarm-based gradient descent method for non-convex optimization

    Lu, Jingcheng | Tadmor, Eitan | Zenginoğlu, Anil

    Communications of the American Mathematical Society, Vol. 4 (2024), Iss. 17 P.787

    https://doi.org/10.1090/cams/42 [Citations: 0]
  5. A Linear Interpolation and Curvature-Controlled Gradient Optimization Strategy Based on Adam

    Sun, Haijing | Zhou, Wen | Shao, Yichuan | Cui, Jiaqi | Xing, Lei | Zhao, Qian | Zhang, Le

    Algorithms, Vol. 17 (2024), Iss. 5 P.185

    https://doi.org/10.3390/a17050185 [Citations: 3]
  6. Anderson Acceleration of Gradient Methods with Energy for Optimization Problems

    Liu, Hailiang | He, Jia-Hao | Tian, Xuping

    Communications on Applied Mathematics and Computation, Vol. 6 (2024), Iss. 2 P.1299

    https://doi.org/10.1007/s42967-023-00327-0 [Citations: 0]