Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science
- PMID: 29921910
- PMCID: PMC6008460
- DOI: 10.1038/s41467-018-04316-3
Abstract
Owing to the success of deep learning across many domains, artificial neural networks are currently among the most widely used artificial intelligence methods. Taking inspiration from the network properties of biological neural networks (e.g. sparsity, scale-freeness), we argue that, contrary to general practice, artificial neural networks should not have fully-connected layers. Here we propose sparse evolutionary training of artificial neural networks, an algorithm that, during learning, evolves an initial sparse topology (an Erdős-Rényi random graph) between two consecutive layers of neurons into a scale-free topology. Our method replaces the fully-connected layers of artificial neural networks with sparse ones before training, quadratically reducing the number of parameters with no decrease in accuracy. We demonstrate our claims on restricted Boltzmann machines, multi-layer perceptrons, and convolutional neural networks for unsupervised and supervised learning on 15 datasets. Our approach has the potential to enable artificial neural networks to scale up beyond what is currently possible.
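The evolution step described in the abstract — drop the weakest connections of a sparse layer, then regrow the same number at random vacant positions — can be sketched in NumPy as follows. This is a minimal illustration under stated assumptions, not the authors' released code: the density factor `epsilon` and rewiring fraction `zeta` mirror the paper's symbols, but the layer sizes and values chosen here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def erdos_renyi_mask(n_in, n_out, epsilon=20):
    """Sparse bipartite Erdos-Renyi mask: connection probability
    scales as epsilon * (n_in + n_out) / (n_in * n_out)."""
    p = min(1.0, epsilon * (n_in + n_out) / (n_in * n_out))
    return rng.random((n_in, n_out)) < p

def set_rewire(weights, mask, zeta=0.3):
    """One sparse-evolution step: remove the fraction `zeta` of active
    connections with the smallest magnitude, then regrow the same
    number of connections at randomly chosen vacant positions."""
    w = np.where(mask, weights, 0.0)
    n_active = int(mask.sum())
    n_drop = int(zeta * n_active)
    if n_drop == 0:
        return w, mask
    # Active connections in row-major order, sorted by |weight|.
    active = np.argwhere(mask)
    mags = np.abs(w[mask])
    drop = active[np.argsort(mags)[:n_drop]]
    new_mask = mask.copy()
    new_mask[drop[:, 0], drop[:, 1]] = False
    # Regrow n_drop connections at distinct vacant positions.
    vacant = np.argwhere(~new_mask)
    grow = vacant[rng.choice(len(vacant), size=n_drop, replace=False)]
    new_mask[grow[:, 0], grow[:, 1]] = True
    # Freshly grown connections start at zero weight here.
    w = np.where(new_mask, w, 0.0)
    return w, new_mask

mask = erdos_renyi_mask(784, 300)
weights = rng.standard_normal((784, 300)) * 0.01
w2, mask2 = set_rewire(weights, mask)
assert mask2.sum() == mask.sum()  # connection count is preserved
```

Because each drop is matched by one regrowth at a distinct vacant slot, the layer's parameter count stays constant across epochs while the topology is free to drift toward hub-dominated (scale-free-like) structure.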
Conflict of interest statement
The authors declare no competing interests.
Similar articles
- Coherent periodic activity in excitatory Erdös-Renyi neural networks: the role of network connectivity. Chaos. 2012 Jun;22(2):023133. doi: 10.1063/1.4723839. PMID: 22757540
- A learning rule for very simple universal approximators consisting of a single layer of perceptrons. Neural Netw. 2008 Jun;21(5):786-95. doi: 10.1016/j.neunet.2007.12.036. PMID: 18249524
- AAGCN: a graph convolutional neural network with adaptive feature and topology learning. Sci Rep. 2024 May 2;14(1):10134. doi: 10.1038/s41598-024-60598-2. PMID: 38698098. Free PMC article.
- Born to learn: The inspiration, progress, and future of evolved plastic artificial neural networks. Neural Netw. 2018 Dec;108:48-67. doi: 10.1016/j.neunet.2018.07.013. PMID: 30142505. Review.
- An introduction to bio-inspired artificial neural network architectures. Acta Neurol Belg. 2003 Mar;103(1):6-12. PMID: 12704977. Review.
Cited by
- A Comprehensive Diagnosis Method of Rolling Bearing Fault Based on CEEMDAN-DFA-Improved Wavelet Threshold Function and QPSO-MPE-SVM. Entropy (Basel). 2021 Aug 31;23(9):1142. doi: 10.3390/e23091142. PMID: 34573767. Free PMC article.
- Perturbation of deep autoencoder weights for model compression and classification of tabular data. Neural Netw. 2022 Dec;156:160-169. doi: 10.1016/j.neunet.2022.09.020. PMID: 36270199. Free PMC article.
- Identification of 12 cancer types through genome deep learning. Sci Rep. 2019 Nov 21;9(1):17256. doi: 10.1038/s41598-019-53989-3. PMID: 31754222. Free PMC article.
- Deep neural networks using a single neuron: folded-in-time architecture using feedback-modulated delay loops. Nat Commun. 2021 Aug 27;12(1):5164. doi: 10.1038/s41467-021-25427-4. PMID: 34453053. Free PMC article.
- ARNS: Adaptive Relay-Node Selection Method for Message Broadcasting in the Internet of Vehicles. Sensors (Basel). 2020 Feb 29;20(5):1338. doi: 10.3390/s20051338. PMID: 32121445. Free PMC article.