Convergence Analysis of a Locally Accelerated Preconditioned Steepest Descent Method for Hermitian-Definite Generalized Eigenvalue Problems

Year:    2018

Author:    Yunfeng Cai, Zhaojun Bai, John E. Pask, N. Sukumar

Journal of Computational Mathematics, Vol. 36 (2018), Iss. 5 : pp. 739–760

Abstract

By extending the classical analysis techniques due to Samokish, Faddeev and Faddeeva, and Longsine and McCormick, among others, we prove the convergence of the preconditioned steepest descent with implicit deflation (PSD-id) method for solving Hermitian-definite generalized eigenvalue problems. Furthermore, we derive a nonasymptotic estimate of the rate of convergence of the PSD-id method. We show that with a proper choice of the shift, the indefinite shift-and-invert preconditioner is a locally accelerated preconditioner and is asymptotically optimal, leading to superlinear convergence. Numerical examples are presented to verify the theoretical results on the convergence behavior of the PSD-id method for solving ill-conditioned Hermitian-definite generalized eigenvalue problems arising from electronic structure calculations. While rigorous and full-scale convergence proofs of the preconditioned block steepest descent methods in practical use still largely elude us, we believe the theoretical results presented in this paper shed light on the convergence behavior of these block methods.
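The iteration described in the abstract can be illustrated with a minimal single-vector sketch: a preconditioned steepest descent step for the smallest eigenpair of the pencil (A, B), accelerated by an indefinite shift-and-invert preconditioner (A − σB)⁻¹ and a Rayleigh–Ritz projection. This is a toy illustration under our own naming (`psd_smallest`, `sigma`, `iters`, `tol` are ours), not the PSD-id algorithm analyzed in the paper, which additionally performs implicit deflation.

```python
import numpy as np
from scipy.linalg import eigh, solve

def psd_smallest(A, B, sigma, x0, iters=50, tol=1e-10):
    """Toy preconditioned steepest descent for the smallest eigenpair of
    A x = lambda B x, with A Hermitian and B Hermitian positive definite.
    Uses the shift-and-invert preconditioner (A - sigma*B)^{-1}."""
    x = x0 / np.sqrt(x0 @ (B @ x0))              # B-normalize the start vector
    for _ in range(iters):
        rho = x @ (A @ x)                        # Rayleigh quotient (x is B-normalized)
        r = A @ x - rho * (B @ x)                # eigen-residual
        if np.linalg.norm(r) < tol:
            break
        w = solve(A - sigma * B, r)              # shift-and-invert preconditioned direction
        w = w - (x @ (B @ w)) * x                # B-orthogonalize against x
        w = w / np.sqrt(w @ (B @ w))
        S = np.column_stack([x, w])
        # Rayleigh-Ritz on span{x, w}: 2x2 projected generalized eigenproblem
        _, Y = eigh(S.T @ A @ S, S.T @ B @ S)
        x = S @ Y[:, 0]                          # Ritz vector for the smallest Ritz value
        x = x / np.sqrt(x @ (B @ x))
    return x @ (A @ x), x
```

With a shift σ placed just below the target eigenvalue, the preconditioned direction closely tracks the inverse-iteration direction, which is the local-acceleration effect the paper quantifies.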

Journal Article Details

Publisher Name:    Global Science Press

Language:    English

DOI:    https://doi.org/10.4208/jcm.1703-m2016-0580

Published online:    2018-01

Copyright:    © Global Science Press

Pages:    22

Keywords:    Eigenvalue problem, Steepest descent method, Preconditioning, Superlinear convergence.

Author Details

Yunfeng Cai

Zhaojun Bai

John E. Pask

N. Sukumar

  1. A Two-Level Preconditioned Helmholtz Subspace Iterative Method for Maxwell Eigenvalue Problems

    Liang, Qigang | Xu, Xuejun

    SIAM Journal on Numerical Analysis, Vol. 61 (2023), Iss. 2 P.642

    https://doi.org/10.1137/21M1392012 [Citations: 0]
  2. Convergence analysis of a block preconditioned steepest descent eigensolver with implicit deflation

    Zhou, Ming | Bai, Zhaojun | Cai, Yunfeng | Neymeyr, Klaus

    Numerical Linear Algebra with Applications, Vol. 30 (2023), Iss. 5

    https://doi.org/10.1002/nla.2498 [Citations: 0]
  3. Applied Mathematics

    Some Unconstrained Optimization Methods

    Djordjevic, Snezana S.

    2019

    https://doi.org/10.5772/intechopen.83679 [Citations: 3]