Convergence of Backpropagation with Momentum for Network Architectures with Skip Connections

Year:    2021

Author:    Chirag Agarwal, Joe Klobusicky, Dan Schonfeld

Journal of Computational Mathematics, Vol. 39 (2021), Iss. 1 : pp. 147–158

Abstract

We study a class of deep neural networks with architectures that form a directed acyclic graph (DAG). For backpropagation defined by gradient descent with adaptive momentum, we show that the weights converge for a large class of nonlinear activation functions. The proof generalizes the results of Wu et al. (2008), who showed convergence for a feed-forward network with one hidden layer. To illustrate the effectiveness of DAG architectures, we describe an example of compression through an autoencoder and compare it against sequential feed-forward networks under several metrics.
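The optimizer family studied in the paper is gradient descent with a momentum term (the heavy-ball update). A minimal sketch of that update on a toy one-dimensional objective is given below; the quadratic objective and all hyperparameter values are illustrative assumptions, not taken from the paper, which treats adaptive momentum over full DAG networks.

```python
# Sketch of gradient descent with momentum (heavy-ball form):
#   w_{t+1} = w_t - lr * grad(w_t) + mu * (w_t - w_{t-1})
# The objective and hyperparameters here are illustrative only.

def minimize_with_momentum(grad, w0, lr=0.1, mu=0.9, steps=200):
    """Run the heavy-ball iteration from w0 and return the final iterate."""
    w_prev, w = w0, w0
    for _ in range(steps):
        w_next = w - lr * grad(w) + mu * (w - w_prev)
        w_prev, w = w, w_next
    return w

# Toy example: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w_star = minimize_with_momentum(lambda w: 2.0 * (w - 3.0), w0=0.0)
```

For this quadratic, the iteration is stable whenever 0 ≤ mu < 1 and lr times the curvature stays below 2(1 + mu), so the iterate approaches the minimizer w = 3. The paper's contribution is proving this kind of convergence for the weights of full DAG-structured networks rather than a scalar toy problem.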

Journal Article Details

Publisher Name:    Global Science Press

Language:    English

DOI:    https://doi.org/10.4208/jcm.1912-m2018-0279

Published online:    2021-01

Copyright:    © Global Science Press

Pages:    12

Keywords:    Backpropagation with momentum, Autoencoders, Directed acyclic graphs.

Author Details

Chirag Agarwal

Joe Klobusicky

Dan Schonfeld

Cited By

  1. Han, Xing; Khan, Rahim. Construction of Economic Data Management System Based on BP Neural Network. Computational Intelligence and Neuroscience, Vol. 2022 (2022), p. 1. https://doi.org/10.1155/2022/9036917