Year: 2022
Author: Jingrun Chen, Shi Jin, Liyao Lyu
Communications in Computational Physics, Vol. 31 (2022), Iss. 4 : pp. 1296–1316
Abstract
Objective functions in large-scale machine-learning and artificial intelligence
applications often live in high dimensions, with strong non-convexity and enormous
numbers of local minima. Gradient-based methods, such as the stochastic gradient method and
Adam [15], and gradient-free methods, such as the consensus-based optimization (CBO)
method, can be employed to find minima. In this work, based on the CBO method and
Adam, we propose a consensus-based global optimization method with adaptive momentum estimation (Adam-CBO). Advantages of the Adam-CBO method include:
• It is capable of finding global minima of non-convex objective functions with
high success rates and low costs. This is verified by finding the global minimizer
of the 1000-dimensional Rastrigin function with a 100% success rate, at a cost that
grows only linearly with the dimensionality (a CBO-style iteration is sketched after this list).
• It can handle non-differentiable activation functions, and can thus approximate low-regularity functions with better accuracy. This is confirmed by solving a machine-learning task for partial differential equations with low-regularity solutions,
where the Adam-CBO method provides better results than Adam.
• It is robust in the sense that its convergence is insensitive to the learning rate, as
shown by a linear stability analysis. This is confirmed by finding the minimizer of a quadratic
function.
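
The exact Adam-CBO update rule is given in the paper itself. The following is a minimal sketch of the general idea under stated assumptions: a standard CBO weighted consensus point, with Adam-style first- and second-moment estimates applied to the CBO drift term. All function names, hyper-parameter values, and the initialization are illustrative, not the authors' exact scheme.

import numpy as np

def rastrigin(x):
    # Rastrigin test function: global minimum 0 at x = 0, with many local minima.
    return 10.0 * x.shape[-1] + np.sum(x**2 - 10.0 * np.cos(2 * np.pi * x), axis=-1)

def adam_cbo_sketch(f, dim, n_particles=200, n_steps=2000,
                    beta=30.0, lr=0.01, sigma=1.0,
                    beta1=0.9, beta2=0.999, eps=1e-8, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(-3.0, 3.0, size=(n_particles, dim))  # particle ensemble
    m = np.zeros_like(X)  # first-moment estimate of the drift (Adam-style)
    v = np.zeros_like(X)  # second-moment estimate of the drift (Adam-style)
    for t in range(1, n_steps + 1):
        fx = f(X)
        # Weighted consensus point: softmax-type weights favor low objective values.
        w = np.exp(-beta * (fx - fx.min()))  # shift by the minimum for stability
        x_bar = (w[:, None] * X).sum(axis=0) / w.sum()
        drift = X - x_bar  # CBO drift pulls each particle toward the consensus point
        # Adam-style adaptive momentum on the drift term, with bias correction.
        m = beta1 * m + (1 - beta1) * drift
        v = beta2 * v + (1 - beta2) * drift**2
        m_hat = m / (1 - beta1**t)
        v_hat = v / (1 - beta2**t)
        # Anisotropic exploration noise scaled componentwise by the drift.
        noise = sigma * np.sqrt(lr) * drift * rng.standard_normal(X.shape)
        X = X - lr * m_hat / (np.sqrt(v_hat) + eps) + noise
    return x_bar

x_star = adam_cbo_sketch(rastrigin, dim=10)
print(rastrigin(x_star))  # near 0 if consensus forms at the global minimizer

Because the update is gradient-free (only evaluations of f appear), the same sketch applies to non-differentiable objectives, which is the setting of the second bullet above.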
Journal Article Details
Publisher Name: Global Science Press
Language: English
DOI: https://doi.org/10.4208/cicp.OA-2021-0144
Published online: 2022-01
Copyright: © Global Science Press
Pages: 21
Keywords: Consensus-based optimization, global optimization, machine learning, curse of dimensionality.