Hartley Spectral Pooling for Deep Learning
Year:    2020

Author:    Hao Zhang, Jianwei Ma

CSIAM Transactions on Applied Mathematics, Vol. 1 (2020), Iss. 3 : pp. 518–529

Abstract

In most convolutional neural networks (CNNs), hidden layers are downsampled to increase computational efficiency and enlarge the receptive field. This operation is commonly called pooling. Popular pooling methods include maximization and averaging over sliding windows (max/average pooling) and plain downsampling in the form of strided convolution. Since pooling is a lossy procedure, our work is motivated by the goal of designing a pooling approach that loses less information during dimensionality reduction. Inspired by the spectral pooling proposed by Rippel et al. [1], we present a spectral pooling method based on the Hartley transform. Compared with Fourier pooling, the proposed spectral pooling avoids the use of complex arithmetic in the frequency representation. The new approach also preserves more structural features for the network's discriminability than max and average pooling. We empirically show that Hartley pooling improves the convergence of CNN training on the MNIST and CIFAR-10 datasets.
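The core idea can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' code: it uses the 2-D discrete Hartley transform (computable from the FFT as Re(F) − Im(F), which is its own inverse up to a 1/(MN) factor), crops the centered low-frequency block, and transforms back. The mean-preserving rescaling at the end is an assumed normalization choice. Because the Hartley transform of a real image is real, no conjugate-symmetry bookkeeping is needed, unlike Fourier spectral pooling.

```python
import numpy as np

def dht2(x):
    # 2-D discrete Hartley transform via the FFT: H(u,v) = Re F(u,v) - Im F(u,v)
    F = np.fft.fft2(x)
    return F.real - F.imag

def idht2(X):
    # This DHT is an involution up to scaling: applying it twice returns M*N*x
    return dht2(X) / X.size

def hartley_spectral_pool(x, out_shape):
    # Transform, keep the centered low-frequency block, transform back.
    X = np.fft.fftshift(dht2(x))          # move low frequencies to the center
    M, N = x.shape
    m, n = out_shape
    r0, c0 = (M - m) // 2, (N - n) // 2
    Xc = X[r0:r0 + m, c0:c0 + n]          # crop the low-frequency block
    y = idht2(np.fft.ifftshift(Xc))
    # Rescale so the image mean is preserved (a normalization choice)
    return y * (m * n) / (M * N)
```

For example, pooling an 8×8 feature map to 4×4 with `hartley_spectral_pool(x, (4, 4))` keeps only the lowest spatial frequencies, so smooth structure survives while fine detail is discarded, and every intermediate array stays real-valued.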

Journal Article Details

Publisher Name:    Global Science Press

Language:    English

DOI:    https://doi.org/10.4208/csiam-am.2020-0018

Published online:    2020-01

Copyright:    © Global Science Press

Pages:    12

Keywords:    Hartley transform, spectral pooling, deep learning.

Author Details

Hao Zhang

Jianwei Ma