SciTePress - Publication Details

Authors: Alessio Ansuini ¹; Eric Medvet ²; Felice Andrea Pellegrino ² and Marco Zullich ²

Affiliations: ¹ International School for Advanced Studies, Trieste, Italy; ² Dipartimento di Ingegneria e Architettura, Università degli Studi di Trieste, Trieste, Italy

Keyword(s): Machine Learning, Pruning, Convolutional Neural Networks, Lottery Ticket Hypothesis, Canonical Correlation Analysis, Explainable Knowledge.

Abstract: During the last few decades, artificial neural networks (ANN) have achieved enormous success in regression and classification tasks. This empirical success has not been matched by an equally strong theoretical understanding of such models, as some of their working principles (training dynamics, generalization properties, and the structure of inner representations) remain largely unknown. It is, for example, particularly difficult to explain the well-known fact that ANNs achieve remarkable levels of generalization even under severe over-parametrization. In our work, we explore a recent network compression technique, called Iterative Magnitude Pruning (IMP), and apply it to convolutional neural networks (CNN). The pruned and unpruned models are compared layer-wise with Canonical Correlation Analysis (CCA). Our results show a high similarity between pruned and unpruned CNNs in the first convolutional layers and in the fully-connected layer, while for the intermediate convolutional layers the similarity is significantly lower. This suggests that, although the representations of pruned and unpruned networks differ markedly in the intermediate layers, in the last part of the network the fully-connected layers act as pivots, producing not only similar performance but also similar representations of the data, despite the large difference in the number of parameters involved.
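The abstract names two techniques that a short sketch can make concrete: Iterative Magnitude Pruning (train, remove the smallest-magnitude weights globally, rewind the survivors, retrain) and a CCA-based similarity between the activations of corresponding layers. The following is a minimal sketch, assuming a PyTorch model; the names imp_round, train_fn and mean_cca_similarity are hypothetical placeholders, and the CCA routine is the standard formulation (canonical correlations as singular values after orthonormalization), not the authors' code.

import copy

import numpy as np
import torch
import torch.nn.utils.prune as prune


def imp_round(model, train_fn, amount=0.2):
    """One IMP round: train, prune the smallest weights globally, rewind the rest.

    train_fn is a placeholder for the user's own training loop.
    """
    initial_state = copy.deepcopy(model.state_dict())  # weights to rewind to
    train_fn(model)                                     # train to convergence
    prunable = [(m, "weight") for m in model.modules()
                if isinstance(m, (torch.nn.Conv2d, torch.nn.Linear))]
    # Mask out the `amount` fraction of weights with the smallest magnitude, globally.
    prune.global_unstructured(prunable,
                              pruning_method=prune.L1Unstructured,
                              amount=amount)
    # Rewind the surviving weights to their initial values; the masks stay in place.
    with torch.no_grad():
        for name, module in model.named_modules():
            if hasattr(module, "weight_orig"):
                module.weight_orig.copy_(initial_state[name + ".weight"])
    return model


def mean_cca_similarity(acts_a, acts_b):
    """Mean canonical correlation between two (n_samples, n_features) activation
    matrices, assuming more samples than features."""
    acts_a = acts_a - acts_a.mean(axis=0)
    acts_b = acts_b - acts_b.mean(axis=0)
    qa, _ = np.linalg.qr(acts_a)
    qb, _ = np.linalg.qr(acts_b)
    # The canonical correlations are the singular values of Qa^T Qb.
    rho = np.linalg.svd(qa.T @ qb, compute_uv=False)
    return float(rho.mean())  # 1.0 = identical subspaces, 0.0 = orthogonal

In an experiment like the one described, one would collect activations from corresponding layers of the pruned and unpruned networks on the same inputs and report mean_cca_similarity layer by layer.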

CC BY-NC-ND 4.0


Paper citation in several formats:
Ansuini, A.; Medvet, E.; Pellegrino, F. and Zullich, M. (2020). On the Similarity between Hidden Layers of Pruned and Unpruned Convolutional Neural Networks. In Proceedings of the 9th International Conference on Pattern Recognition Applications and Methods - ICPRAM; ISBN 978-989-758-397-1; ISSN 2184-4313, SciTePress, pages 52-59. DOI: 10.5220/0008960300520059

@conference{icpram20,
  author={Alessio Ansuini and Eric Medvet and Felice Andrea Pellegrino and Marco Zullich},
  title={On the Similarity between Hidden Layers of Pruned and Unpruned Convolutional Neural Networks},
  booktitle={Proceedings of the 9th International Conference on Pattern Recognition Applications and Methods - ICPRAM},
  year={2020},
  pages={52-59},
  publisher={SciTePress},
  organization={INSTICC},
  doi={10.5220/0008960300520059},
  isbn={978-989-758-397-1},
  issn={2184-4313},
}

TY - CONF

JO - Proceedings of the 9th International Conference on Pattern Recognition Applications and Methods - ICPRAM
TI - On the Similarity between Hidden Layers of Pruned and Unpruned Convolutional Neural Networks
SN - 978-989-758-397-1
IS - 2184-4313
AU - Ansuini, A.
AU - Medvet, E.
AU - Pellegrino, F.
AU - Zullich, M.
PY - 2020
SP - 52
EP - 59
DO - 10.5220/0008960300520059
PB - SciTePress
ER -