
Data Symmetries and Learning in Fully Connected Neural Networks


Symmetries of the data (translation, rotation, scale, and flip) are reflected in the learned weights of a neural network.


Abstract:

How symmetries in the data constrain the learned weights of modern deep networks is still an open problem. In this work we study the simple case of fully connected shallow non-linear neural networks and consider two types of symmetries: full dataset symmetries, where the dataset $X$ is mapped into itself by a transformation $g$, i.e. $gX = X$, and single data point symmetries, where $gx = x$ for $x \in X$. We prove and experimentally confirm that symmetries in the data are directly inherited at the level of the network's learned weights, and we relate these findings to the common practice of data augmentation in modern machine learning. Finally, we show that symmetry constraints have a profound impact on the spectrum of the learned weights, an aspect of the network's so-called implicit bias.
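
To make the two symmetry types concrete, here is a minimal NumPy sketch (illustrative only, not code from the paper), taking $g$ to be a flip acting on 1-D vectors: a full dataset symmetry $gX = X$ is manufactured by augmenting $X$ with $gX$, exactly the data-augmentation recipe the abstract refers to, while a single data point symmetry $gx = x$ holds when every point is itself invariant under $g$ (here, palindromic vectors). The helper name is_dataset_symmetric is introduced here for illustration.

    import numpy as np

    def g(x):
        # The flip transformation: reverse a data point (a 1-D vector).
        return x[::-1]

    rng = np.random.default_rng(0)

    # Full dataset symmetry, gX = X: augmenting X with g(X) makes the
    # dataset closed under g -- the standard data-augmentation recipe.
    X = rng.normal(size=(100, 8))
    X_aug = np.vstack([X, np.apply_along_axis(g, 1, X)])

    def is_dataset_symmetric(X, g):
        # Check gX = X as set equality, i.e. up to a reordering of rows.
        gX = np.apply_along_axis(g, 1, X)
        order = lambda A: A[np.lexsort(A.T[::-1])]
        return np.array_equal(order(gX), order(X))

    print(is_dataset_symmetric(X, g))      # False: random X is not closed under g
    print(is_dataset_symmetric(X_aug, g))  # True by construction

    # Single data point symmetry, gx = x for every x in X: each point is
    # itself invariant, e.g. palindromic vectors are fixed by the flip.
    half = rng.normal(size=(100, 4))
    X_fix = np.hstack([half, half[:, ::-1]])  # each row satisfies g(x) == x
    print(np.array_equal(np.apply_along_axis(g, 1, X_fix), X_fix))  # True

Note the design point this sketch mirrors: augmenting a dataset with its transformed copies is precisely how one enforces $gX = X$ in practice, which is why the paper can connect data symmetries to data augmentation.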
Published in: IEEE Access (Volume: 11)
Page(s): 47282 - 47290
Date of Publication: 10 May 2023
Electronic ISSN: 2169-3536
