An Accelerated Preconditioned Primal-Dual Gradient Algorithm for Nonconvex Composite Optimization Problems with Applications
Abstract
In this paper, we consider a class of three-composite nonconvex optimization problems in which the nonsmooth function is further composed with a linear operator. This problem has many applications, such as sparse signal recovery, image processing, and machine learning. Based on conjugate duality theory, we present an accelerated preconditioned primal-dual gradient algorithm for this problem. Compared with existing algorithms, ours only needs to evaluate the proximal mapping of the conjugate function $h^*$, which is always convex and lower semicontinuous; it never needs the proximal mapping of a nonconvex function, which may significantly reduce the computational cost. We prove that the sequence generated by the proposed algorithm converges globally to a critical point when the objective satisfies the Kurdyka-Łojasiewicz property, and we establish the convergence rate of the algorithm. Finally, numerical results on sparse signal recovery and image processing illustrate the efficiency and competitiveness of the proposed algorithm.
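The abstract's key computational claim is that only the proximal mapping of the conjugate $h^*$ is ever needed. For a proper, closed, convex $h$, this mapping can be obtained from the proximal mapping of $h$ itself via the classical Moreau decomposition, $\mathrm{prox}_{\sigma h^*}(v) = v - \sigma\,\mathrm{prox}_{h/\sigma}(v/\sigma)$. The following sketch (not taken from the paper; the function names are illustrative) checks this identity for $h = \|\cdot\|_1$, whose conjugate is the indicator of the $\ell_\infty$ unit ball, so $\mathrm{prox}_{\sigma h^*}$ reduces to projection onto $[-1,1]^n$:

```python
import numpy as np

def prox_l1(v, t):
    """Soft-thresholding: proximal mapping of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_conjugate(v, sigma, prox_h):
    """Moreau decomposition:
    prox_{sigma h*}(v) = v - sigma * prox_{h/sigma}(v / sigma)."""
    return v - sigma * prox_h(v / sigma, 1.0 / sigma)

v = np.array([2.0, -0.5, 0.3, -3.0])
sigma = 1.5

# prox of the conjugate of the l1 norm, via the Moreau identity
p = prox_conjugate(v, sigma, prox_l1)

# For h = ||.||_1, h* is the indicator of the l_inf unit ball, so
# prox_{sigma h*} is the projection onto [-1, 1]^n (independent of sigma).
proj = np.clip(v, -1.0, 1.0)
print(np.allclose(p, proj))  # the two computations agree
```

In a primal-dual scheme this is exactly why convexity of $h^*$ suffices: the dual update stays a convex proximal step even when the overall problem is nonconvex.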
About this article
How to Cite
An Accelerated Preconditioned Primal-Dual Gradient Algorithm for Nonconvex Composite Optimization Problems with Applications. (2026). Journal of Computational Mathematics. https://doi.org/10.4208/jcm.2512-m2024-0107