Year: 2023
Author: Lei Wu
Journal of Machine Learning, Vol. 2 (2023), Iss. 4 : pp. 259–270
Abstract
An important problem in machine learning theory is to understand the approximation and generalization properties of two-layer neural networks in high dimensions. To this end, researchers have introduced the Barron space $\mathcal{B}_s(\Omega)$ and the spectral Barron space $\mathcal{F}_s(\Omega)$, where the index $s \in [0, \infty)$ indicates the smoothness of the functions in these spaces and $\Omega \subset \mathbb{R}^d$ denotes the input domain. However, the precise relationship between the two types of Barron spaces remains unclear. In this paper, we establish a continuous embedding between them, as implied by the following inequality: for any $\delta \in (0, 1)$, $s \in \mathbb{N}^+$, and $f: \Omega \to \mathbb{R}$, it holds that $$\delta\,\|f\|_{\mathcal{F}_{s-\delta}(\Omega)} \lesssim_s \|f\|_{\mathcal{B}_s(\Omega)} \lesssim_s \|f\|_{\mathcal{F}_{s+1}(\Omega)}.$$ Importantly, the constants do not depend on the input dimension $d$, suggesting that the embedding is effective in high dimensions. Moreover, we show that both the lower and upper bounds are tight.
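For orientation, the display below recalls one common formulation of the two norms from the literature on Barron-type spaces; conventions differ across papers (choice of vector norm on the weights, activation function, and indexing convention), so this is only a sketch of typical definitions rather than the paper's exact ones.
$$\|f\|_{\mathcal{F}_s(\Omega)} \;=\; \inf_{f_e|_\Omega = f} \int_{\mathbb{R}^d} \big(1+\|\xi\|_1\big)^s\,\big|\widehat{f_e}(\xi)\big|\,\mathrm{d}\xi,$$
$$\|f\|_{\mathcal{B}_s(\Omega)} \;=\; \inf_{\rho}\,\mathbb{E}_{(a,w,b)\sim\rho}\Big[|a|\,\big(\|w\|_1+|b|\big)^s\Big] \quad \text{over all } \rho \text{ with } f(x)=\mathbb{E}_{(a,w,b)\sim\rho}\big[a\,\sigma(w^\top x+b)\big] \text{ on } \Omega,$$
where the first infimum runs over extensions $f_e$ of $f$ to $\mathbb{R}^d$, $\widehat{f_e}$ denotes the Fourier transform, and $\sigma$ is a two-layer activation such as the ReLU.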
Journal Article Details
Publisher Name: Global Science Press
Language: English
DOI: https://doi.org/10.4208/jml.230530
Published online: 2023-01
Copyright: © Global Science Press
Pages: 12
Keywords: Barron space, Two-layer neural network, High-dimensional approximation, Embedding theorem, Fourier transform.