
Constructing Robust Liquid State Machines to Process Highly Variable Data Streams

  • Conference paper
Artificial Neural Networks and Machine Learning – ICANN 2012 (ICANN 2012)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 7552)


Abstract

In this paper, we propose a mechanism to effectively control the overall neural activity in the reservoir of a Liquid State Machine (LSM) in order to achieve both a high sensitivity of the reservoir to weak stimuli and an improved resistance to over-stimulation for strong inputs. The idea is to dynamically change the firing threshold of a neuron depending on its spike activity. We experimentally demonstrate that reservoirs employing this neural model significantly increase their separation capabilities. We also investigate the role of dynamic and static synapses in this context. The obtained results may be very valuable for LSM-based real-world applications, in which the input signal is often highly variable and can cause either too little or too much network activity.
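
The adaptive-threshold idea described in the abstract can be prototyped in a few lines. The sketch below is a minimal illustration, not the authors' implementation: it assumes a standard leaky integrate-and-fire neuron whose firing threshold is raised at every spike and decays exponentially back to a baseline. All parameter names and values (theta_plus, tau_theta, and so on) are illustrative assumptions rather than values taken from the paper.

```python
import numpy as np


def simulate_adaptive_threshold_lif(
    input_current,        # sequence of input currents, one value per time step
    dt=1.0,               # simulation time step (ms)
    tau_m=30.0,           # membrane time constant (ms)
    v_rest=0.0,           # resting potential
    v_reset=0.0,          # potential the membrane is reset to after a spike
    theta_base=1.0,       # baseline firing threshold
    theta_plus=0.5,       # amount the threshold is raised at every spike
    tau_theta=50.0,       # time constant of the threshold's decay to baseline (ms)
):
    """Leaky integrate-and-fire neuron whose threshold rises with each spike
    and relaxes back to a baseline, damping activity under strong stimulation
    while remaining sensitive to weak inputs."""
    v = v_rest
    theta = theta_base
    spike_times = []
    for step, i_in in enumerate(input_current):
        # Leaky integration of the input current.
        v += dt / tau_m * (-(v - v_rest) + i_in)
        # The threshold decays exponentially back toward its baseline.
        theta += dt / tau_theta * (theta_base - theta)
        if v >= theta:
            spike_times.append(step * dt)
            v = v_reset            # reset the membrane potential
            theta += theta_plus    # raise the threshold: the next spike is harder to trigger
    return np.array(spike_times)


if __name__ == "__main__":
    # A strong, constant input: the adapting threshold spreads the spikes out
    # instead of letting the neuron fire at its maximum rate throughout.
    spikes = simulate_adaptive_threshold_lif(np.full(1000, 2.0))
    print(f"{len(spikes)} spikes; first five at {spikes[:5]} ms")
```

In a full LSM, such a rule would be applied to every reservoir neuron, so that strongly driven neurons raise their own thresholds and the overall reservoir activity stays within a useful dynamic range regardless of input strength.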




Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Schliebs, S., Fiasché, M., Kasabov, N. (2012). Constructing Robust Liquid State Machines to Process Highly Variable Data Streams. In: Villa, A.E.P., Duch, W., Érdi, P., Masulli, F., Palm, G. (eds) Artificial Neural Networks and Machine Learning – ICANN 2012. ICANN 2012. Lecture Notes in Computer Science, vol 7552. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33269-2_76


  • DOI: https://doi.org/10.1007/978-3-642-33269-2_76

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-33268-5

  • Online ISBN: 978-3-642-33269-2

  • eBook Packages: Computer Science, Computer Science (R0)
