Year: 2023
Authors: Kai Jiang, Juan Zhang, Qi Zhou
CSIAM Transactions on Applied Mathematics, Vol. 4 (2023), Iss. 4 : pp. 672–695
Abstract
Matrix splitting iteration methods play a vital role in solving large sparse linear systems. Their performance heavily depends on the splitting parameters; however, the approach of selecting optimal splitting parameters has not been well developed. In this paper, we present a multitask kernel-learning parameter prediction method to automatically obtain relatively optimal splitting parameters, which combines simultaneous prediction of multiple parameters with data-driven kernel learning. For solving time-dependent linear systems, including linear differential systems and linear matrix systems, we give a new matrix splitting Kronecker product method, together with its convergence analysis and preconditioning strategy. Numerical results illustrate that our methods can save an enormous amount of time in selecting the relatively optimal splitting parameters compared with existing methods. Moreover, our iteration method as a preconditioner can effectively accelerate GMRES. As the dimension of the systems increases, all the advantages of our approaches become more significant. In particular, for solving the differential Sylvester matrix equation, the speedup ratio can reach tens to hundreds of times when the scale of the system is larger than one hundred thousand.
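For context, a minimal sketch of a generic parameterized matrix splitting iteration (an illustration of the general framework, not the specific method proposed in this paper; the splitting $M(\alpha)$, $N(\alpha)$ and parameter $\alpha$ are generic notation): given $Ax = b$, write $A = M(\alpha) - N(\alpha)$ with $M(\alpha)$ nonsingular and iterate

\[
x^{(k+1)} = M(\alpha)^{-1}\bigl(N(\alpha)\,x^{(k)} + b\bigr), \qquad k = 0, 1, 2, \dots
\]

The iteration converges for every initial guess if and only if the spectral radius $\rho\bigl(M(\alpha)^{-1}N(\alpha)\bigr) < 1$, and its asymptotic rate is governed by this radius; choosing $\alpha$ to (approximately) minimize it is the splitting-parameter selection problem the abstract refers to.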
Journal Article Details
Publisher Name: Global Science Press
Language: English
DOI: https://doi.org/10.4208/csiam-am.SO-2022-0049
Published online: 2023-01
Copyright: © Global Science Press
Pages: 24
Keywords: Multitask kernel-learning parameter prediction, time-dependent linear systems, matrix splitting Kronecker product method, convergence analysis, preconditioning.