On Approximation by Neural Networks with Optimized Activation Functions and Fixed Weights

Year:    2023

Author:    Dansheng Yu, Yunyou Qian, Fengjun Li

Analysis in Theory and Applications, Vol. 39 (2023), Iss. 1 : pp. 93–104

Abstract

Recently, Li [16] introduced three kinds of single hidden-layer feed-forward neural networks (FNNs) with optimized piecewise linear activation functions and fixed weights, and obtained upper and lower bound estimates for the approximation accuracy of these FNNs for continuous functions defined on bounded intervals. In the present paper, we point out that there are errors both in the definitions of the FNNs and in the proofs of the upper estimates in [16]. Using new methods, we give correct estimates of the approximation rates achieved by Li's neural networks.
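
To fix ideas, the following Python sketch illustrates the general shape of such an approximant: a single hidden-layer network on [a, b] whose inner weights and biases are fixed by a uniform grid (nothing is trained) and whose activation is piecewise linear. This is an illustrative assumption only, not the construction of Li [16]; the names hat and fnn_approx and the particular hat-shaped activation are hypothetical.

import numpy as np

# Illustrative sketch only -- NOT the construction of Li [16]. Inner weights
# 1/h and biases -x_k/h come from a fixed uniform grid; only the outer
# coefficients depend on f, and they are simply samples of f.

def hat(t):
    # Piecewise linear activation: 1 at t = 0, decaying linearly to 0 at |t| = 1.
    return np.maximum(0.0, 1.0 - np.abs(t))

def fnn_approx(f, a, b, n):
    # Network with n + 1 hidden units: x -> sum_k f(x_k) * hat((x - x_k) / h),
    # where x_k are equally spaced nodes and h = (b - a) / n. It reproduces f
    # at the nodes, so the uniform error is bounded by the modulus of
    # continuity omega(f, h).
    nodes = np.linspace(a, b, n + 1)
    h = (b - a) / n
    coeffs = f(nodes)
    def net(x):
        x = np.atleast_1d(np.asarray(x, dtype=float))
        return np.sum(coeffs[:, None] * hat((x[None, :] - nodes[:, None]) / h), axis=0)
    return net

# Usage: approximate cos on [0, pi]; the maximum error shrinks as n grows.
net = fnn_approx(np.cos, 0.0, np.pi, n=64)
xs = np.linspace(0.0, np.pi, 1001)
print(float(np.max(np.abs(net(xs) - np.cos(xs)))))

The point of "fixed weights" is visible here: the inner parameters are set in advance by the grid and do not depend on the target function f, which only enters through the outer coefficients.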

Journal Article Details

Publisher Name:    Global Science Press

Language:    English

DOI:    https://doi.org/10.4208/ata.OA-2021-0006

Published online:    2023-01

Copyright:    © Global Science Press

Pages:    12

Keywords:    Approximation rate, modulus of continuity, modulus of smoothness, neural network operators.
