Abstract
In this paper, we propose a mechanism to effectively control the overall neural activity in the reservoir of a Liquid State Machine (LSM) in order to achieve both high sensitivity of the reservoir to weak stimuli and improved resistance to over-stimulation by strong inputs. The idea is to employ a mechanism that dynamically changes the firing threshold of a neuron depending on its spike activity. We experimentally demonstrate that reservoirs employing this neural model significantly increase their separation capabilities. We also investigate the role of dynamic and static synapses in this context. The obtained results may be very valuable for LSM-based real-world applications, in which the input signal is often highly variable, causing problems of either too little or too much network activity.
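The core mechanism described above, a firing threshold that rises with a neuron's own spike activity and relaxes back to a baseline, can be illustrated with a minimal sketch. The following Python snippet is not the authors' implementation; all parameter values (`tau_m`, `theta_inc`, `tau_theta`, etc.) are illustrative assumptions for a leaky integrate-and-fire neuron with a spike-driven adaptive threshold:

```python
def simulate_adaptive_lif(input_current, dt=1.0, tau_m=30.0, v_rest=0.0,
                          theta0=1.0, theta_inc=0.2, tau_theta=50.0):
    """Leaky integrate-and-fire neuron with a spike-driven adaptive threshold.

    Each emitted spike raises the threshold by theta_inc; between spikes the
    threshold decays back toward its baseline theta0 with time constant
    tau_theta. Strong inputs therefore self-limit the firing rate, while the
    neuron stays responsive to weak inputs. All parameters are illustrative.
    """
    v = v_rest      # membrane potential
    theta = theta0  # current (adaptive) firing threshold
    spike_times = []
    for t, current in enumerate(input_current):
        v += dt / tau_m * (v_rest - v + current)    # leaky integration of input
        theta += dt / tau_theta * (theta0 - theta)  # threshold relaxes to baseline
        if v >= theta:
            spike_times.append(t)
            v = v_rest           # reset membrane potential after a spike
            theta += theta_inc   # spike-driven threshold increase
    return spike_times

# A strong constant input produces more spikes than a weak one, but the
# adaptive threshold damps its rate compared to a fixed-threshold neuron
# (theta_inc=0.0 disables the adaptation).
weak_spikes = simulate_adaptive_lif([1.2] * 200)
strong_spikes = simulate_adaptive_lif([2.0] * 200)
fixed_spikes = simulate_adaptive_lif([2.0] * 200, theta_inc=0.0)
```

Comparing `strong_spikes` with `fixed_spikes` shows the intended effect of the adaptation: the same strong input drives fewer spikes when the threshold adapts, which is one simple way to curb reservoir over-stimulation.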
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Schliebs, S., Fiasché, M., Kasabov, N. (2012). Constructing Robust Liquid State Machines to Process Highly Variable Data Streams. In: Villa, A.E.P., Duch, W., Érdi, P., Masulli, F., Palm, G. (eds) Artificial Neural Networks and Machine Learning – ICANN 2012. ICANN 2012. Lecture Notes in Computer Science, vol 7552. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33269-2_76
Print ISBN: 978-3-642-33268-5
Online ISBN: 978-3-642-33269-2