Modularizing Deep Learning via Pairwise Learning With Kernels
- PMID: 33400656
- DOI: 10.1109/TNNLS.2020.3042346
Abstract
By redefining the conventional notions of layers, we present an alternative view on finitely wide, fully trainable deep neural networks as stacked linear models in feature spaces, leading to a kernel machine interpretation. Based on this construction, we then propose a provably optimal modular learning framework for classification that does not require between-module backpropagation. This modular approach brings new insights into the label requirement of deep learning (DL). It leverages only implicit pairwise labels (weak supervision) when learning the hidden modules. When training the output module, on the other hand, it requires full supervision but achieves high label efficiency, needing as few as ten randomly selected labeled examples (one from each class) to achieve 94.88% accuracy on CIFAR-10 with a ResNet-18 backbone. Moreover, modular training enables fully modularized DL workflows, which simplify the design and implementation of pipelines and improve the maintainability and reusability of models. To showcase the advantages of such a modularized workflow, we describe a simple yet reliable method for estimating the reusability of pretrained modules as well as task transferability in a transfer learning setting. At practically no computational overhead, it precisely describes the task space structure of 15 binary classification tasks from CIFAR-10.
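The "implicit pairwise labels" mentioned above can be illustrated with a minimal sketch. The idea (as summarized in the abstract, details in the paper) is that hidden modules need only know whether two examples belong to the same class, not which class each belongs to. The function name below is a hypothetical illustration, not the authors' API:

```python
import itertools

def pairwise_labels(class_labels):
    """Derive implicit pairwise labels from per-example class labels.

    Returns (i, j, s) triples where s = 1 if examples i and j share a
    class and s = 0 otherwise. Only this same/different signal (weak
    supervision) is assumed to be needed for the hidden modules.
    """
    pairs = []
    for (i, yi), (j, yj) in itertools.combinations(enumerate(class_labels), 2):
        pairs.append((i, j, 1 if yi == yj else 0))
    return pairs

# Three examples with classes [0, 0, 1]: only the first pair matches.
print(pairwise_labels([0, 0, 1]))
# → [(0, 1, 1), (0, 2, 0), (1, 2, 0)]
```

Note that the pairwise signal is strictly weaker than full labels: it is invariant to any permutation of class identities, which is why it counts as weak supervision.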
Similar articles
- The Role of Knowledge Creation-Oriented Convolutional Neural Network in Learning Interaction. Comput Intell Neurosci. 2022 Mar 16;2022:6493311. doi: 10.1155/2022/6493311. eCollection 2022. PMID: 35341199. Free PMC article.
- Targeted transfer learning to improve performance in small medical physics datasets. Med Phys. 2020 Dec;47(12):6246-6256. doi: 10.1002/mp.14507. Epub 2020 Oct 25. PMID: 33007112.
- Biologically motivated learning method for deep neural networks using hierarchical competitive learning. Neural Netw. 2021 Dec;144:271-278. doi: 10.1016/j.neunet.2021.08.027. Epub 2021 Sep 3. PMID: 34520937.
- Deep learning for electroencephalogram (EEG) classification tasks: a review. J Neural Eng. 2019 Jun;16(3):031001. doi: 10.1088/1741-2552/ab0ab5. Epub 2019 Feb 26. PMID: 30808014. Review.
- Deep learning in spiking neural networks. Neural Netw. 2019 Mar;111:47-63. doi: 10.1016/j.neunet.2018.12.002. Epub 2018 Dec 18. PMID: 30682710. Review.
Cited by
- Distinctive properties of biological neural networks and recent advances in bottom-up approaches toward a better biologically plausible neural network. Front Comput Neurosci. 2023 Jun 28;17:1092185. doi: 10.3389/fncom.2023.1092185. eCollection 2023. PMID: 37449083. Free PMC article. Review.