Embedding Principle: A Hierarchical Structure of Loss Landscape of Deep Neural Networks
Year: 2022
Authors: Yaoyu Zhang, Yuqing Li, Zhongwang Zhang, Tao Luo, Zhi-Qin John Xu
Journal of Machine Learning, Vol. 1 (2022), Iss. 1, pp. 60–113
Abstract
We prove a general Embedding Principle of the loss landscape of deep neural networks (NNs), which unravels a hierarchical structure of the loss landscape: the loss landscape of any NN contains all critical points of all narrower NNs. This result is obtained by constructing a class of critical embeddings, which map any critical point of a narrower NN to a critical point of the target NN with the same output function. By discovering a wide class of general compatible critical embeddings, we provide a coarse estimate of the dimension of the critical submanifolds embedded from critical points of narrower NNs. We further prove an irreversibility property of any critical embedding: the number of negative/zero/positive eigenvalues of the Hessian at a critical point may increase, but never decreases, as the NN becomes wider through the embedding. Using a special realization of the general compatible critical embedding, we prove a stringent necessary condition for a critical point to be "truly bad", i.e., to never become a strict-saddle point through any critical embedding. This result implies that strict-saddle points are commonplace in wide NNs, which may be an important reason for the easy optimization of wide NNs widely observed in practice.
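To make the construction concrete, below is a minimal numerical sketch of one well-known realization of a critical embedding: the one-step "splitting" embedding, in which a hidden neuron of a two-layer tanh network is duplicated and its outgoing weight is split as (alpha, 1 - alpha). The network, the toy data, and all names (forward, split_neuron, alpha) are illustrative assumptions for this sketch, not the authors' code; it only checks numerically the two properties the abstract states, namely that the embedded network has the same output function and that the gradient at the embedded point is a fixed linear image of the original gradient, so critical points map to critical points.

```python
import torch

torch.manual_seed(0)

def forward(W, b, a, X):
    # Two-layer network f(x) = a^T tanh(W x + b); hidden width = len(b).
    return torch.tanh(X @ W.T + b) @ a

def loss(W, b, a, X, y):
    # Mean squared error, the loss whose critical points we compare.
    return 0.5 * ((forward(W, b, a, X) - y) ** 2).mean()

def split_neuron(W, b, a, k, alpha):
    # Duplicate hidden neuron k and split its outgoing weight a_k into
    # alpha * a_k and (1 - alpha) * a_k; the output function is unchanged.
    W2 = torch.cat([W, W[k:k + 1]], dim=0)
    b2 = torch.cat([b, b[k:k + 1]])
    a2 = torch.cat([a, (1 - alpha) * a[k:k + 1]])
    a2[k] = alpha * a[k]
    return W2, b2, a2

# Toy data and a random narrow network of hidden width 3.
X, y = torch.randn(64, 2), torch.randn(64)
W, b, a = torch.randn(3, 2), torch.randn(3), torch.randn(3)
for p in (W, b, a):
    p.requires_grad_(True)

# Embed into width 4 by splitting neuron k = 1.
alpha = 0.3
W2, b2, a2 = split_neuron(W.detach(), b.detach(), a.detach(), 1, alpha)
for p in (W2, b2, a2):
    p.requires_grad_(True)

# (1) The embedded network computes the same output function.
assert torch.allclose(forward(W, b, a, X), forward(W2, b2, a2, X), atol=1e-5)

# (2) The gradient at the embedded point is a fixed linear image of the
# original gradient: the two copies carry alpha / (1 - alpha) shares of
# neuron 1's W-gradient and both reproduce its a-gradient. In particular,
# zero gradient (a critical point) maps to zero gradient.
g = torch.autograd.grad(loss(W, b, a, X, y), (W, b, a))
g2 = torch.autograd.grad(loss(W2, b2, a2, X, y), (W2, b2, a2))
assert torch.allclose(g2[0][1], alpha * g[0][1], atol=1e-5)        # W, copy 1
assert torch.allclose(g2[0][3], (1 - alpha) * g[0][1], atol=1e-5)  # W, copy 2
assert torch.allclose(g2[2][1], g[2][1], atol=1e-5)                # a, copy 1
assert torch.allclose(g2[2][3], g[2][1], atol=1e-5)                # a, copy 2
print("output preserved; critical points embed to critical points")
```

Because the embedded gradient depends linearly on the original one with coefficients alpha and 1 - alpha, every choice of alpha yields another critical point with the same output function, which is one way to picture the critical submanifolds whose dimension the paper estimates.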
Journal Article Details
Publisher Name: Global Science Press
Language: English
DOI: https://doi.org/10.4208/jml.220108
Published online: 2022-01
Copyright: © Global Science Press
Pages: 54
Keywords: Neural network, Loss landscape, Critical point, Embedding principle