
Hardware Spiking Neural Networks with Pair-Based STDP Using Stochastic Computing

Published in Neural Processing Letters.

Abstract

Spiking Neural Networks (SNNs) can closely mimic biological neural systems. Recently, SNNs have been implemented in hardware to emulate the time encoding and information-processing behaviour of the human brain in real time. However, hardware SNN systems suffer from large resource consumption due to the high complexity of their computational units. In this paper, a novel hardware SNN system based on stochastic computing is proposed to address this problem. The SNN is designed with pair-based spike-timing-dependent plasticity (STDP) coupled with integrate-and-fire neurons. Stochastic computing simplifies the multipliers, adders, and subtractors used in conventional hardware SNNs and thereby reduces the hardware resource cost. Experimental results show that, compared with state-of-the-art approaches, the proposed SNN system reduces resource consumption by 58.0% (registers in particular by at least 65.6%). Meanwhile, the maximum normalized root mean square error between the proposed hardware and the other implementations is only 0.0097, so the SNN behaviour is preserved. This work provides a beneficial alternative for large-scale hardware SNN implementations.
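To make the core idea concrete, the following minimal Python sketch illustrates (i) how stochastic computing turns a multiplication into a single AND operation on unipolar bitstreams, and (ii) the pair-based STDP rule that such a multiplier would feed. It is only an illustration of the general technique: the parameter names and values (A_PLUS, A_MINUS, TAU_PLUS, TAU_MINUS, the stream length n) are assumptions for the example, not the fixed-point formats or architecture of the proposed hardware.

```python
import numpy as np

rng = np.random.default_rng(0)

def to_stochastic(p, n=4096):
    """Encode a probability p in [0, 1] as a unipolar stochastic bitstream of length n."""
    return rng.random(n) < p

def sc_multiply(x, y, n=4096):
    """Stochastic-computing multiplication: AND two independent unipolar bitstreams.
    The ones-density of the result approximates x * y, replacing a fixed-point
    multiplier with a single AND gate per bit in hardware."""
    return np.mean(to_stochastic(x, n) & to_stochastic(y, n))

# Pair-based STDP weight update (illustrative parameters, not the paper's values).
A_PLUS, A_MINUS = 0.01, 0.012      # potentiation / depression amplitudes (assumed)
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # time constants in ms (assumed)

def pair_stdp_dw(dt):
    """Ideal weight change for spike-time difference dt = t_post - t_pre (in ms)."""
    if dt >= 0:                                # pre before post -> potentiation
        return A_PLUS * np.exp(-dt / TAU_PLUS)
    return -A_MINUS * np.exp(dt / TAU_MINUS)   # post before pre -> depression

# Example: approximate the potentiation term A_PLUS * exp(-dt / TAU_PLUS)
# with a stochastic AND in place of an exact multiplier.
dt = 5.0
exact = pair_stdp_dw(dt)
approx = sc_multiply(A_PLUS, np.exp(-dt / TAU_PLUS))
print(f"exact dw = {exact:.5f}, SC-approximated dw = {approx:.5f}")
```

In a typical stochastic-computing datapath, each bit of a stream is produced by comparing the operand against a pseudo-random number (for example from an LFSR), so the multiplication reduces to one AND gate per bit; the sketch above only models that behaviour numerically.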




Acknowledgements

This research is supported by the National Natural Science Foundation of China under Grant 61976063, the Guangxi Natural Science Foundation under Grant 2022GXNSFFA035028, the research fund of Guangxi Normal University under Grant 2021JC006, and the AI+Education research project of the Guangxi Humanities Society Science Development Research Center under Grant ZXZJ202205.

Author information


Corresponding author

Correspondence to Yuling Luo.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Liu, J., Wang, Y., Luo, Y. et al. Hardware Spiking Neural Networks with Pair-Based STDP Using Stochastic Computing. Neural Process Lett 55, 7155–7173 (2023). https://doi.org/10.1007/s11063-023-11255-8


  • DOI: https://doi.org/10.1007/s11063-023-11255-8
