Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science
- PMID: 29921910
- PMCID: PMC6008460
- DOI: 10.1038/s41467-018-04316-3
Abstract
Through the success of deep learning in various domains, artificial neural networks are currently among the most widely used artificial intelligence methods. Taking inspiration from the network properties of biological neural networks (e.g. sparsity, scale-freeness), we argue that, contrary to general practice, artificial neural networks, too, should not have fully-connected layers. Here we propose sparse evolutionary training of artificial neural networks, an algorithm which evolves an initial sparse topology (Erdős-Rényi random graph) between two consecutive layers of neurons into a scale-free topology during learning. Our method replaces the fully-connected layers of artificial neural networks with sparse ones before training, quadratically reducing the number of parameters with no decrease in accuracy. We demonstrate our claims on restricted Boltzmann machines, multi-layer perceptrons, and convolutional neural networks for unsupervised and supervised learning on 15 datasets. Our approach has the potential to enable artificial neural networks to scale up beyond what is currently possible.
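The evolutionary step the abstract describes (start from a sparse Erdős-Rényi topology, then repeatedly prune weak connections and regrow new ones at random during training) can be sketched compactly. Below is a minimal, illustrative NumPy version, not the authors' implementation: the names `erdos_renyi_mask` and `set_rewire` are hypothetical, and magnitude-based pruning is used as a stand-in for the paper's removal of the smallest positive and largest negative weights.

```python
import numpy as np

def erdos_renyi_mask(n_in, n_out, epsilon=20.0, rng=None):
    """Sparse Erdos-Renyi layer mask: each connection exists with
    probability epsilon * (n_in + n_out) / (n_in * n_out)."""
    if rng is None:
        rng = np.random.default_rng()
    p = epsilon * (n_in + n_out) / (n_in * n_out)
    return rng.random((n_in, n_out)) < p

def set_rewire(weights, zeta=0.3, rng=None):
    """One evolution step: prune the fraction `zeta` of active
    connections with the smallest magnitude (an approximation of the
    paper's rule), then regrow the same number of connections at
    random empty positions."""
    if rng is None:
        rng = np.random.default_rng()
    active = weights != 0
    n_active = int(active.sum())
    n_prune = int(zeta * n_active)

    # Prune: zero out the n_prune weakest active connections.
    threshold = np.sort(np.abs(weights[active]))[n_prune]
    keep = active & (np.abs(weights) >= threshold)

    # Regrow: place the pruned budget at random empty positions,
    # initialized with small random weights.
    empty = np.argwhere(~keep)
    n_grow = n_active - int(keep.sum())
    chosen = empty[rng.choice(len(empty), size=n_grow, replace=False)]

    new_weights = np.where(keep, weights, 0.0)
    new_weights[chosen[:, 0], chosen[:, 1]] = rng.normal(0, 0.01, n_grow)
    return new_weights

# Usage sketch: one rewiring step after each training epoch.
rng = np.random.default_rng(0)
mask = erdos_renyi_mask(784, 256, rng=rng)
weights = np.where(mask, rng.normal(0, 0.01, mask.shape), 0.0)
# ... one epoch of gradient updates on the nonzero weights ...
weights = set_rewire(weights, zeta=0.3, rng=rng)
```

Repeating this prune-and-regrow cycle lets the connectivity drift away from the uniform Erdős-Rényi starting point; in the paper, this process is what produces the emergent scale-free topology.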
Conflict of interest statement
The authors declare no competing interests.
Similar articles
- Coherent periodic activity in excitatory Erdös-Renyi neural networks: the role of network connectivity. Chaos. 2012 Jun;22(2):023133. doi: 10.1063/1.4723839. PMID: 22757540
- A learning rule for very simple universal approximators consisting of a single layer of perceptrons. Neural Netw. 2008 Jun;21(5):786-95. doi: 10.1016/j.neunet.2007.12.036. Epub 2007 Dec 31. PMID: 18249524
- AAGCN: a graph convolutional neural network with adaptive feature and topology learning. Sci Rep. 2024 May 2;14(1):10134. doi: 10.1038/s41598-024-60598-2. PMID: 38698098. Free PMC article.
- Born to learn: The inspiration, progress, and future of evolved plastic artificial neural networks. Neural Netw. 2018 Dec;108:48-67. doi: 10.1016/j.neunet.2018.07.013. Epub 2018 Aug 7. PMID: 30142505. Review.
- An introduction to bio-inspired artificial neural network architectures. Acta Neurol Belg. 2003 Mar;103(1):6-12. PMID: 12704977. Review.
Cited by
- Optimization of Online Education and Teaching Evaluation System Based on GA-BP Neural Network. Comput Intell Neurosci. 2021 Aug 12;2021:8785127. doi: 10.1155/2021/8785127. PMID: 34422036. Free PMC article. Retracted; see retraction in: Comput Intell Neurosci. 2023 Aug 2;2023:9795420. doi: 10.1155/2023/9795420
- Dendritic normalisation improves learning in sparsely connected artificial neural networks. PLoS Comput Biol. 2021 Aug 9;17(8):e1009202. doi: 10.1371/journal.pcbi.1009202. PMID: 34370727. Free PMC article.
- Consistent Sparse Deep Learning: Theory and Computation. J Am Stat Assoc. 2022;117(540):1981-1995. doi: 10.1080/01621459.2021.1895175. Epub 2021 Apr 20. PMID: 36945326. Free PMC article.
- BioGD: Bio-inspired robust gradient descent. PLoS One. 2019 Jul 5;14(7):e0219004. doi: 10.1371/journal.pone.0219004. PMID: 31276469. Free PMC article.
- Rolling Bearing Fault Diagnosis Based on VMD-MPE and PSO-SVM. Entropy (Basel). 2021 Jun 16;23(6):762. doi: 10.3390/e23060762. PMID: 34208777. Free PMC article.