Unsupervised Adaptive Weight Pruning for Energy-Efficient Neuromorphic Systems
- PMID: 33281549
- PMCID: PMC7689062
- DOI: 10.3389/fnins.2020.598876
Abstract
To tackle real-world challenges, deep and complex neural networks are generally used with a massive number of parameters, which require large memory size, extensive computational operations, and high energy consumption in neuromorphic hardware systems. In this work, we propose an unsupervised online adaptive weight pruning method that dynamically removes non-critical weights from a spiking neural network (SNN) to reduce network complexity and improve energy efficiency. The adaptive pruning method exploits the neural dynamics and firing activity of SNNs, adapting the pruning threshold over time and across neurons during training. This adaptation scheme allows the network to effectively identify the critical weights associated with each neuron by changing the pruning threshold dynamically over time and across neurons. It balances each neuron's connection strength with the previous layer through adaptive thresholds and prevents weak neurons from failing after pruning. We also evaluated the improvement in the energy efficiency of SNNs with our method by counting synaptic operations (SOPs). Simulation results and detailed analyses reveal that applying adaptation to the pruning threshold can significantly improve network performance and reduce the number of SOPs. The pruned SNN with 800 excitatory neurons achieves a 30% reduction in SOPs during training and a 55% reduction during inference, with only 0.44% accuracy loss on the MNIST dataset. Compared with a previously reported online soft pruning method, the proposed adaptive pruning method shows 3.33% higher classification accuracy and 67% more reduction in SOPs. The effectiveness of our method was confirmed on different datasets and for different network sizes. Our evaluation showed that the implementation overhead of the adaptive method in terms of speed, area, and energy is negligible in the network.
Therefore, this work offers a promising solution for effective network compression and building highly energy-efficient neuromorphic systems in real-time applications.
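The core idea of the abstract can be illustrated with a small sketch: each postsynaptic neuron gets its own pruning threshold, scaled by its recent firing activity, so that weakly firing neurons keep more of their incoming weights and are not starved of input after pruning. The threshold rule below (`base_threshold`, `alpha`, and the linear scaling) is a hypothetical formulation for illustration, not the paper's exact update.

```python
import numpy as np

def adaptive_prune(weights, firing_rates, base_threshold=0.1, alpha=0.5):
    """Per-neuron adaptive weight pruning (illustrative sketch).

    weights:      (n_pre, n_post) synaptic weight matrix.
    firing_rates: (n_post,) recent firing activity of each postsynaptic neuron.

    Highly active neurons get a higher threshold (more aggressive pruning);
    weakly active neurons get a lower one, protecting them from losing too
    many connections. The linear scaling rule here is an assumption.
    """
    # Normalize activity to [0, 1]; the epsilon guards against an all-silent layer.
    norm = firing_rates / (firing_rates.max() + 1e-12)
    thresholds = base_threshold * (1.0 + alpha * norm)      # shape (n_post,)
    # Keep only weights whose magnitude reaches their neuron's threshold.
    mask = np.abs(weights) >= thresholds[np.newaxis, :]
    return weights * mask, mask

# Small usage example: column 1's neuron fires more, so it is pruned harder.
w = np.array([[0.05, 0.20],
              [0.30, 0.08]])
fr = np.array([0.0, 1.0])                 # thresholds become [0.10, 0.15]
pruned, mask = adaptive_prune(w, fr)
```

Running the pruning step once per training interval, as weights and firing rates evolve, gives the "over time and across neurons" adaptation described above; counting only the surviving synapses then yields the SOP reduction.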
Keywords: STDP; neuromorphic computing; pattern recognition; pruning; spiking neural networks; unsupervised learning.
Copyright © 2020 Guo, Fouda, Yantir, Eltawil and Salama.
Similar articles
- Neural Coding in Spiking Neural Networks: A Comparative Study for Robust Neuromorphic Systems. Front Neurosci. 2021 Mar 4;15:638474. doi: 10.3389/fnins.2021.638474. PMID: 33746705.
- A Soft-Pruning Method Applied During Training of Spiking Neural Networks for In-memory Computing Applications. Front Neurosci. 2019 Apr 26;13:405. doi: 10.3389/fnins.2019.00405. PMID: 31080402.
- Neuron pruning in temporal domain for energy efficient SNN processor design. Front Neurosci. 2023 Nov 30;17:1285914. doi: 10.3389/fnins.2023.1285914. PMID: 38099202.
- Deep Learning With Spiking Neurons: Opportunities and Challenges. Front Neurosci. 2018 Oct 25;12:774. doi: 10.3389/fnins.2018.00774. PMID: 30410432. Review.
- Direct training high-performance deep spiking neural networks: a review of theories and methods. Front Neurosci. 2024 Jul 31;18:1383844. doi: 10.3389/fnins.2024.1383844. PMID: 39145295. Review.
Cited by
- A flexible capacitive photoreceptor for the biomimetic retina. Light Sci Appl. 2022 Jan 1;11(1):3. doi: 10.1038/s41377-021-00686-4. PMID: 34974516.
- Sharing leaky-integrate-and-fire neurons for memory-efficient spiking neural networks. Front Neurosci. 2023 Jul 31;17:1230002. doi: 10.3389/fnins.2023.1230002. PMID: 37583415.
- Neural Coding in Spiking Neural Networks: A Comparative Study for Robust Neuromorphic Systems. Front Neurosci. 2021 Mar 4;15:638474. doi: 10.3389/fnins.2021.638474. PMID: 33746705.
- Real-time execution of SNN models with synaptic plasticity for handwritten digit recognition on SIMD hardware. Front Neurosci. 2024 Aug 6;18:1425861. doi: 10.3389/fnins.2024.1425861. PMID: 39165339.
- Backpropagation With Sparsity Regularization for Spiking Neural Network Learning. Front Neurosci. 2022 Apr 14;16:760298. doi: 10.3389/fnins.2022.760298. PMID: 35495028.