PowerNet: Efficient Representations of Polynomials and Smooth Functions by Deep Neural Networks with Rectified Power Units

Year:    2020

Author:    Bo Li, Shanshan Tang, Haijun Yu

Journal of Mathematical Study, Vol. 53 (2020), Iss. 2 : pp. 159–191

Abstract

Deep neural networks with rectified linear units (ReLU) have become increasingly popular in recent years. However, the function represented by a ReLU network has discontinuous derivatives, which limits the use of ReLU networks to situations where smoothness is not required. In this paper, we construct deep neural networks with rectified power units (RePU), which give better approximations of smooth functions. Optimal algorithms are proposed to explicitly build neural networks with sparsely connected RePUs, which we call PowerNets, that represent polynomials with no approximation error. For general smooth functions, we first project the function onto its polynomial approximation and then use the proposed algorithms to construct the corresponding PowerNet. Thus, the error of the best polynomial approximation provides an upper bound on the best RePU network approximation error. For smooth functions in higher-dimensional Sobolev spaces, we use fast spectral transforms on tensor-product grid and sparse grid discretizations to obtain polynomial approximations. Our constructive algorithms clearly show a close connection between spectral methods and deep neural networks: a PowerNet with $n$ hidden layers can exactly represent polynomials up to degree $s^n$, where $s$ is the power of the RePUs. The proposed PowerNets have potential applications in situations where high accuracy or smoothness is required.
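The exact-representation claim rests on elementary RePU identities that can be checked numerically. The sketch below is an illustrative NumPy snippet, not the paper's construction algorithm; the helper names (repu, power_s, product) and the test values are introduced here for illustration only. It verifies that two RePU units of power $s$ reproduce $x^s$ exactly, that composing such layers raises the representable degree to $s^n$, and that a product $xy$ is exactly representable with RePUs of power 2.

```python
# Illustrative check of RePU identities with NumPy (not the paper's algorithm);
# repu, power_s and product are hypothetical helper names introduced here.
import numpy as np

def repu(x, s):
    """Rectified power unit sigma_s(x) = max(0, x)**s."""
    return np.maximum(0.0, x) ** s

def power_s(x, s):
    """x**s built exactly from one layer of two RePU units:
    x**s = sigma_s(x) + (-1)**s * sigma_s(-x)."""
    return repu(x, s) + (-1) ** s * repu(-x, s)

def product(x, y):
    """x*y built exactly from power-2 RePUs via
    x*y = ((x + y)**2 - (x - y)**2) / 4."""
    return (power_s(x + y, 2) - power_s(x - y, 2)) / 4.0

x = np.linspace(-2.0, 2.0, 9)
y = np.linspace(-1.0, 3.0, 9)
s = 3

# one hidden RePU layer reproduces x**s exactly (up to floating-point rounding)
assert np.allclose(power_s(x, s), x ** s)
# composing the construction once more yields degree s**2, consistent with
# "n hidden layers represent polynomials up to degree s^n"
assert np.allclose(power_s(power_s(x, s), s), x ** (s * s))
# exact multiplication with power-2 RePUs
assert np.allclose(product(x, y), x * y)
print("RePU identities verified")
```

Starting from such exact powers and products, a polynomial approximation of the target function (e.g., a truncated spectral expansion) can be assembled into a sparsely connected RePU network; this assembly is the role of the constructive algorithms in the paper.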


Journal Article Details

Publisher Name:    Global Science Press

Language:    English

DOI:    https://doi.org/10.4208/jms.v53n2.20.03

Journal:    Journal of Mathematical Study, Vol. 53 (2020), Iss. 2, pp. 159–191

Published online:    2020-01

AMS Subject Headings:   

Copyright:    © Global Science Press

Pages:    33

Keywords:    Deep neural network, rectified linear unit, rectified power unit, sparse grid, PowerNet.

Author Details

Bo Li

Shanshan Tang

Haijun Yu

Cited By

  1. Simultaneous approximation of a smooth function and its derivatives by deep neural networks with piecewise-polynomial activations

    Belomestny, Denis | Naumov, Alexey | Puchkin, Nikita | Samsonov, Sergey

    Neural Networks, Vol. 161 (2023), Iss. P.242

    https://doi.org/10.1016/j.neunet.2023.01.035 [Citations: 7]
  2. Theoretical guarantees for neural control variates in MCMC

    Belomestny, Denis | Goldman, Artur | Naumov, Alexey | Samsonov, Sergey

    Mathematics and Computers in Simulation, Vol. 220 (2024), Iss. P.382

    https://doi.org/10.1016/j.matcom.2024.01.019 [Citations: 2]
  3. De Rham compatible Deep Neural Network FEM

    Longo, Marcello | Opschoor, Joost A.A. | Disch, Nico | Schwab, Christoph | Zech, Jakob

    Neural Networks, Vol. 165 (2023), Iss. P.721

    https://doi.org/10.1016/j.neunet.2023.06.008 [Citations: 3]
  4. ChebNet: Efficient and Stable Constructions of Deep Neural Networks with Rectified Power Units via Chebyshev Approximation

    Tang, Shanshan | Li, Bo | Yu, Haijun

    Communications in Mathematics and Statistics, Vol. (2024), Iss.

    https://doi.org/10.1007/s40304-023-00392-0 [Citations: 0]
  5. Self-learning activation functions to increase accuracy of privacy-preserving Convolutional Neural Networks with homomorphic encryption

    Pulido-Gaytan, Bernardo | Tchernykh, Andrei | J, Andrew

    PLOS ONE, Vol. 19 (2024), Iss. 7 P.e0306420

    https://doi.org/10.1371/journal.pone.0306420 [Citations: 0]
  6. Deep Learning in High Dimension: Neural Network Expression Rates for Analytic Functions in $L^2(\mathbb{R}^d,\gamma_d)$

    Schwab, Christoph | Zech, Jakob

    SIAM/ASA Journal on Uncertainty Quantification, Vol. 11 (2023), Iss. 1 P.199

    https://doi.org/10.1137/21M1462738 [Citations: 6]