A Gradient Iteration Method for Functional Linear Regression in Reproducing Kernel Hilbert Spaces
Abstract
We consider a gradient iteration algorithm for prediction in functional linear regression under the framework of reproducing kernel Hilbert spaces. The algorithm uses early stopping, instead of classical Tikhonov regularization, to keep the iterates from overfitting. Under mild conditions, we obtain upper bounds on the excess prediction risk that essentially match the known minimax lower bounds. Almost sure convergence of the proposed algorithm is also established.
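To make the idea concrete, the following is a minimal sketch of gradient iteration with early stopping for functional linear regression, not the paper's actual algorithm or assumptions. The simulated curves, the Gaussian kernel, the step size, and the validation-based stopping rule are all illustrative choices of ours; the curves are discretized on a grid and the RKHS gradient step uses the kernel Gram matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretize [0, 1] on a grid of m points
m = 100
t = np.linspace(0, 1, m)
dt = 1.0 / m

# Simulated predictor curves X_i(t) built from a few sine modes
# (illustrative data, not the paper's setting)
n = 200
coefs = rng.normal(size=(n, 4)) / np.arange(1, 5)
basis = np.array([np.sin((j + 1) * np.pi * t) for j in range(4)])
X = coefs @ basis

beta_true = np.sin(2 * np.pi * t)          # true slope function
y = X @ beta_true * dt + 0.1 * rng.normal(size=n)

# Gaussian reproducing kernel Gram matrix on the grid (bandwidth is arbitrary)
K = np.exp(-((t[:, None] - t[None, :]) ** 2) / 0.05)

# Hold out a validation set to implement early stopping
n_tr = 150
Xtr, ytr = X[:n_tr], y[:n_tr]
Xva, yva = X[n_tr:], y[n_tr:]

f = np.zeros(m)          # current iterate, values of f on the grid
eta = 0.5                # step size
best_val, best_f, patience = np.inf, f.copy(), 0
for k in range(500):
    resid = Xtr @ f * dt - ytr
    # RKHS gradient of the empirical risk: (2/n) * sum_i resid_i * (K X_i)(.)
    grad = (2.0 / n_tr) * dt * (K @ (Xtr.T @ resid))
    f = f - eta * grad
    # Early stopping: track validation risk, stop once it stops improving
    val = np.mean((Xva @ f * dt - yva) ** 2)
    if val < best_val:
        best_val, best_f, patience = val, f.copy(), 0
    else:
        patience += 1
        if patience >= 10:
            break
```

The iteration count plays the role of the regularization parameter: stopping early keeps the iterate in a low-complexity ball of the RKHS, much as a Tikhonov penalty would, while running to convergence would interpolate the noise.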