Year: 2022
Authors: Hongzhi Tong, Michael Ng
Annals of Applied Mathematics, Vol. 38 (2022), Iss. 3 : pp. 280–295
Abstract
We consider a gradient iteration algorithm for prediction in functional linear regression under the framework of reproducing kernel Hilbert spaces. In the algorithm, we use an early stopping technique, instead of the classical Tikhonov regularization, to prevent the iterates from overfitting. Under mild conditions, we obtain upper bounds for the excess prediction risk that essentially match the known minimax lower bounds. Almost sure convergence of the proposed algorithm is also established.
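To illustrate the kind of procedure the abstract describes, below is a minimal numerical sketch of gradient iteration with validation-based early stopping for functional linear regression on a discretized grid. The Gaussian kernel, the grid size, the synthetic data, the step size eta, and the patience-based stopping rule are all illustrative assumptions and not the authors' exact algorithm or analysis setting.

```python
# Hypothetical sketch: kernel gradient iteration with early stopping for
# functional linear regression Y = <X, beta> + noise, with beta in an RKHS.
# All choices below (Gaussian kernel, grid, step size, stopping rule) are
# assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Discretization grid for the functional covariates X_i(t), t in [0, 1].
m = 50
t = np.linspace(0.0, 1.0, m)
dt = t[1] - t[0]

# Synthetic data: Y_i = <X_i, beta*> + noise, with a smooth true slope beta*.
n = 200
X = rng.standard_normal((n, m)).cumsum(axis=1) * np.sqrt(dt)  # rough random curves
beta_true = np.sin(2 * np.pi * t)
y = X @ beta_true * dt + 0.1 * rng.standard_normal(n)

# Hold out a validation set used only for the early-stopping rule.
n_train = 150
X_tr, y_tr = X[:n_train], y[:n_train]
X_va, y_va = X[n_train:], y[n_train:]

# Gaussian reproducing kernel evaluated on the grid (assumed choice of K).
sigma = 0.2
K = np.exp(-((t[:, None] - t[None, :]) ** 2) / (2 * sigma ** 2))

def predict(Xmat, beta):
    """Riemann-sum approximation of the functional linear predictor <X_i, beta>."""
    return Xmat @ beta * dt

def risk(Xmat, yvec, beta):
    """Empirical squared prediction risk."""
    return 0.5 * np.mean((predict(Xmat, beta) - yvec) ** 2)

# Gradient iteration: the Euclidean gradient of the empirical risk is mapped
# into the RKHS by (a discretization of) the kernel integral operator.
eta = 0.5          # constant step size (assumed)
max_iter = 2000
patience = 20      # iterations without validation improvement before stopping

beta = np.zeros(m)
best_beta, best_val, stall = beta.copy(), np.inf, 0
for it in range(max_iter):
    resid = predict(X_tr, beta) - y_tr
    grad_euclid = X_tr.T @ resid * dt / n_train
    beta = beta - eta * (K @ grad_euclid) * dt   # kernel gradient step

    val = risk(X_va, y_va, beta)
    if val < best_val - 1e-8:
        best_val, best_beta, stall = val, beta.copy(), 0
    else:
        stall += 1
        if stall >= patience:                    # early stopping, no Tikhonov penalty
            break

print(f"stopped at iteration {it}, validation risk {best_val:.4f}")
```

In this sketch the number of iterations plays the role of the regularization parameter: stopping early keeps the estimate smooth, in the same spirit as the early-stopping technique the abstract contrasts with Tikhonov regularization.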
Journal Article Details
Publisher Name: Global Science Press
Language: English
DOI: https://doi.org/10.4208/aam.OA-2021-0016
Published online: 2022-01
Copyright: © Global Science Press
Pages: 16
Keywords: Gradient iteration algorithm, functional linear regression, reproducing kernel Hilbert space, early stopping, convergence rates.