Abstract
Reservoir computing (RC) is a popular approach to the efficient design of recurrent neural networks (RNNs), where the dynamical part of the model is initialized and left untrained. Deep echo state networks (ESNs) combine the deep learning approach with RC by structuring the reservoir in multiple layers, thus offering the striking advantage of encoding the input sequence on different time-scales. A key factor for the effectiveness of ESNs is the echo state property (ESP), which ensures the asymptotic stability of the reservoir dynamics. In this paper, we perform an in-depth theoretical analysis of the asymptotic dynamics of Deep ESNs under different contractivity hierarchies, offering a more accurate sufficient condition for the ESP. We investigate how different contractivity hierarchies affect memory capacity and predictive performance in regression tasks, concluding that structuring the reservoir layers with decreasing contractivity is the best design choice. The results of this paper can potentially also be applied to the design of fully-trained RNNs.
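To make the layered construction concrete, the following minimal NumPy sketch builds an untrained deep reservoir in which each layer reads the state of the previous one and is rescaled to a progressively smaller spectral radius, used here as a simple proxy for a decreasing-contractivity hierarchy. This is not the authors' code: all sizes, spectral radii, and helper names (scale_to_radius, deep_esn_states, rhos) are illustrative assumptions.

```python
# Illustrative Deep ESN reservoir sketch (assumptions only, not the paper's exact setup).
import numpy as np

rng = np.random.default_rng(0)

def scale_to_radius(W, rho):
    """Rescale a square matrix so that its spectral radius equals rho."""
    return W * (rho / np.max(np.abs(np.linalg.eigvals(W))))

N_U, N_R, L = 1, 100, 3
rhos = [0.9, 0.6, 0.3]  # decreasing spectral radii across layers (assumed values)
W_in = [rng.uniform(-0.1, 0.1, size=(N_R, N_U if l == 0 else N_R)) for l in range(L)]
W = [scale_to_radius(rng.uniform(-1.0, 1.0, size=(N_R, N_R)), rhos[l]) for l in range(L)]

def deep_esn_states(u):
    """Run the untrained deep reservoir on an input sequence u of shape (T, N_U)."""
    x = [np.zeros(N_R) for _ in range(L)]
    states = []
    for u_t in u:
        inp = u_t
        for l in range(L):
            x[l] = np.tanh(W_in[l] @ inp + W[l] @ x[l])  # tanh keeps each state in [-1, 1]^{N_R}
            inp = x[l]                                   # layer l+1 reads the state of layer l
        states.append(np.concatenate(x))
    return np.asarray(states)  # shape (T, L * N_R)

# Example usage: collect deep reservoir states for a random input sequence.
states = deep_esn_states(rng.uniform(-1.0, 1.0, size=(200, N_U)))
print(states.shape)  # (200, 300)
```

In a full RC pipeline the concatenated layer states would then be fed to a linear readout trained by, e.g., ridge regression, while the reservoir weights above stay untrained.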
Notes
- 1. More details on uniform convergence requirements can be found in [24].
- 2. The reservoir layers \(F^{(\ell)}\) defined in Sect. 2 have state space \(\mathcal{X}_F = \mathcal{X} = [-1,1]^{N_R}\), and input space either \(\mathcal{U}_F = \mathcal{U} \subset \mathbb{R}^{N_U}\) if \(\ell = 1\) or \(\mathcal{U}_F = \mathcal{X}\) if \(\ell > 1\) (an illustrative sketch of the corresponding layer maps follows these notes).
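For reference, layer maps consistent with these spaces can be written as follows, assuming, purely for illustration, a standard non-leaky deep ESN update with tanh nonlinearity; the precise definition of \(F^{(\ell)}\) is the one given in Sect. 2.

```latex
% Illustrative layer maps consistent with the spaces above (assumed form):
\begin{aligned}
  F^{(1)}    &: \mathcal{X} \times \mathcal{U} \to \mathcal{X}, &
  x^{(1)}_t    &= \tanh\bigl(W^{(1)}_{\mathrm{in}} u_t + W^{(1)} x^{(1)}_{t-1}\bigr),\\
  F^{(\ell)} &: \mathcal{X} \times \mathcal{X} \to \mathcal{X}, &
  x^{(\ell)}_t &= \tanh\bigl(W^{(\ell)}_{\mathrm{in}} x^{(\ell-1)}_t + W^{(\ell)} x^{(\ell)}_{t-1}\bigr),
  \quad \ell > 1.
\end{aligned}
```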
References
Doyne Farmer, J.: Chaotic attractors of an infinite-dimensional dynamical system. Physica D Nonlinear Phenom. 4(3), 366–393 (1982). https://doi.org/10.1016/0167-2789(82)90042-2
Gallicchio, C.: Short-term memory of deep RNN. In: Proceedings of the 26th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, ESANN 2018, pp. 633–638 (2018)
Gallicchio, C., Micheli, A.: Architectural and Markovian factors of echo state networks. Neural Netw. 24(5), 440–456 (2011)
Gallicchio, C., Micheli, A.: Echo state property of deep reservoir computing networks. Cogn. Comput. 9(3), 337–350 (2017). https://doi.org/10.1007/s12559-017-9461-9
Gallicchio, C., Micheli, A.: Reservoir topology in deep echo state networks. In: Tetko, I.V., Kůrková, V., Karpov, P., Theis, F. (eds.) ICANN 2019. LNCS, vol. 11731, pp. 62–75. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-30493-5_6
Gallicchio, C., Micheli, A.: Richness of deep echo state network dynamics. In: Rojas, I., Joya, G., Catala, A. (eds.) IWANN 2019. LNCS, vol. 11506, pp. 480–491. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-20521-8_40
Gallicchio, C., Micheli, A., Pedrelli, L.: Deep reservoir computing: a critical experimental analysis. Neurocomputing 268, 87–99 (2017)
Gallicchio, C., Micheli, A., Pedrelli, L.: Design of deep echo state networks. Neural Netw. 108, 33–47 (2018). https://doi.org/10.1016/j.neunet.2018.08.002
Gallicchio, C., Micheli, A., Pedrelli, L.: Comparison between DeepESNs and gated RNNs on multivariate time-series prediction. In: Proceedings of the 27th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, ESANN 2019, pp. 619–624 (2019)
Gallicchio, C., Micheli, A., Pedrelli, L.: Hierarchical temporal representation in linear reservoir computing. Smart Innov. Syst. Technol. 102, 119–129 (2019). https://doi.org/10.1007/978-3-319-95098-3_11
Gallicchio, C., Micheli, A., Silvestri, L.: Local Lyapunov exponents of deep echo state networks. Neurocomputing 298, 34–45 (2018). https://doi.org/10.1016/j.neucom.2017.11.073
Hammer, B., Tiňo, P.: Recurrent neural networks with small weights implement definite memory machines. Neural Comput. 15(8), 1897–1929 (2003). https://doi.org/10.1162/08997660360675080
Jaeger, H.: The “echo state” approach to analysing and training recurrent neural networks - with an erratum note. Technical report 148, German National Research Institute for Computer Science (2001)
Jaeger, H.: Short term memory in echo state networks. Technical report 152, German National Research Institute for Computer Science (2002)
Jaeger, H., Haas, H.: Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication. Science 304(5667), 78–80 (2004). https://doi.org/10.1126/science.1091277
LeCun, Y., Bengio, Y., Hinton, G.: Deep learning. Nature 521(7553), 436–444 (2015). https://doi.org/10.1038/nature14539
Lukoševičius, M.: A practical guide to applying echo state networks. In: Montavon, G., Orr, G.B., Müller, K.-R. (eds.) Neural Networks: Tricks of the Trade. LNCS, vol. 7700, pp. 659–686. Springer, Heidelberg (2012). https://doi.org/10.1007/978-3-642-35289-8_36
Lukoševičius, M., Jaeger, H.: Reservoir computing approaches to recurrent neural network training. Comput. Sci. Rev. 3(3), 127–149 (2009). https://doi.org/10.1016/j.cosrev.2009.03.005
Mackey, M.C., Glass, L.: Oscillation and chaos in physiological control systems. Science 197(4300), 287–289 (1977). https://doi.org/10.1126/science.267326
Tanaka, G., et al.: Recent advances in physical reservoir computing: a review. Neural Netw. 115, 100–123 (2019)
Verstraeten, D., Schrauwen, B., d’Haene, M., Stroobandt, D.: An experimental unification of reservoir computing methods. Neural Netw. 20(3), 391–403 (2007)
Weigend, A.S.: Time Series Prediction: Forecasting the Future and Understanding the Past. Routledge, May 2018. https://doi.org/10.4324/9780429492648
White, O.L., Lee, D.D., Sompolinsky, H.: Short-term memory in orthogonal neural networks. Phys. Rev. Lett. 92(14), 148102 (2004). https://doi.org/10.1103/physrevlett.92.148102
Yildiz, I.B., Jaeger, H., Kiebel, S.J.: Re-visiting the echo state property. Neural Netw. 35, 1–9 (2012)
Acknowledgements
This work was partially funded by the project BrAID under the Bando Ricerca Salute 2018 - Regional public call for research and development projects aimed at supporting clinical and organizational innovation processes of the Regional Health Service - Regione Toscana.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Tortorella, D., Gallicchio, C., Micheli, A. (2022). Hierarchical Dynamics in Deep Echo State Networks. In: Pimenidis, E., Angelov, P., Jayne, C., Papaleonidas, A., Aydin, M. (eds) Artificial Neural Networks and Machine Learning – ICANN 2022. ICANN 2022. Lecture Notes in Computer Science, vol 13531. Springer, Cham. https://doi.org/10.1007/978-3-031-15934-3_55