Abstract
We investigate various methods of combining Echo State Networks (ESNs), including a method that we dub Restricted ESNs. We provide a notation for describing restricted ESNs, and use it to benchmark a standard ESN against restricted ones. We investigate two methods of keeping the weight-matrix density consistent when comparing a restricted ESN to a standard one, which we call "overall consistency" and "patch consistency". We benchmark restricted ESNs on the NARMA10 and sunspot prediction benchmarks, and find that restricted ESNs perform similarly to standard ones. We present some application scenarios in which restricted ESNs may offer advantages over standard ESNs.
Notes
1.
2. The grid search is modified to split the range of \(f\) into 10 and use that as the initial step size, and then to split the range between the optimal value and its neighbour into 10 for the secondary step; a sketch of this two-stage search follows.
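As an illustration of the note above, here is a minimal Python sketch of such a two-stage search. The function `evaluate` and the interpretation of "neighbour" as the grid points on either side of the coarse optimum are our assumptions, not taken from the paper.

```python
import numpy as np

def two_stage_grid_search(evaluate, f_min, f_max):
    """Two-stage grid search over f, as described in the note.

    `evaluate(f)` is assumed to return an error to be minimised.
    The initial step splits [f_min, f_max] into 10 intervals; the
    secondary step splits the range around the coarse optimum into 10.
    """
    # Initial step: 10 intervals (11 points) across the full range.
    coarse = np.linspace(f_min, f_max, 11)
    i = int(np.argmin([evaluate(f) for f in coarse]))

    # Secondary step: refine between the optimum and its neighbours.
    lo = coarse[max(i - 1, 0)]
    hi = coarse[min(i + 1, len(coarse) - 1)]
    fine = np.linspace(lo, hi, 11)
    return fine[int(np.argmin([evaluate(f) for f in fine]))]
```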
Acknowledgement
This work was made possible by PhD studentship funding from the Computer Science Department of the University of York.
Appendices
A Calculating \(D_W\) for the Overall-Consistent Case
Given an ESN with \(N\) nodes and an average density \(0 \le D \le 1\), we wish to restrict that ESN to have \(n\) subreservoirs of equal size; we assume \(n\) divides \(N\). We set the density within the subreservoirs, \(D_W\), to be greater than the density between them, \(D_B\), by a factor of \(f\); that is, \(D_W = f D_B\).
In a restricted ESN with n subreservoirs, each of size N/n, there are n regions in the edge matrix \({\textbf {W}}\) of size \(({N}/{n})^2\) with density \(D_W\), and a further \(n^2-n\) regions also of size \(({N}/{n})^2\) with density \(D_B\).
Hence the average density \(D\) of such a restricted ESN is:
\[ D = \frac{n (N/n)^2 D_W + (n^2 - n)(N/n)^2 D_B}{N^2} = \frac{D_W + (n-1) D_B}{n} \]
Substituting \(D_W = f D_B\), and rearranging to get an expression for \(D_B\) in terms of \(D\), we get:
\[ D_B = \frac{n D}{f + n - 1} \qquad (7) \]
Once \(D_B\) is known, we also have \(D_W\) from \(D_W = f D_B\).
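To make the overall-consistent calculation concrete, here is a minimal Python sketch of the procedure; the function name and the example parameter values are ours, chosen for illustration.

```python
def consistent_densities(N, n, D, f):
    """Return (D_B, D_W) for an overall-consistent restricted ESN.

    N: total number of nodes; n: number of subreservoirs (n divides N);
    D: target average density of W; f: density ratio D_W / D_B.
    """
    assert N % n == 0, "n must divide N"
    D_B = n * D / (f + n - 1)  # Eq. 7
    D_W = f * D_B
    assert D_W <= 1, "infeasible: D_W would exceed 1 for these parameters"
    return D_B, D_W

# Example: N = 100 nodes, n = 4 subreservoirs, D = 0.1, f = 5
# gives D_B = 0.05 and D_W = 0.25; check: (0.25 + 3 * 0.05) / 4 = 0.1.
print(consistent_densities(100, 4, 0.1, 5.0))
```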
B Optimising \(f\)
In order to find the best possible restricted ESN within our constraints, we optimise over the parameter \(f\). However, we must first bound the search space.
In the restricted ESN, we want \(D_B\) to be strictly less than \(D_W\) (connections between subreservoirs are less dense than connections within them); therefore, \(f > 1\).
To find an upper bound, we assume that every subreservoir is connected to every other subreservoir, that is, every connection weight matrix \({\textbf {B}}_{i,j}\) has at least one non-zero entry. This requires \(D_B \ge (n/N)^2\). (In the experiments, the weight matrices are generated probabilistically, so when close to this density limit, there may not be an edge between every pair of subreservoirs.)
Rearranging Eq. 7 gives:
\[ f = \frac{n D}{D_B} - (n - 1) \]
The lower limit on \(D_B\) gives an upper limit on \(f\):
\[ f \le \frac{n D}{(n/N)^2} - (n - 1) = \frac{D N^2}{n} - n + 1 \]
We also have an upper limit on the derived density, \(D_W \le 1\) (equality implies there are no zero elements in the relevant weight matrix). Substituting \(D_B = D_W / f\) into Eq. 7 gives:
\[ D_W = \frac{f n D}{f + n - 1} \le 1 \]
Rearranging (this bound applies when \(nD > 1\)) gives another upper limit on \(f\):
\[ f \le \frac{n - 1}{n D - 1} \]
Hence we have the upper and lower bounds on \(f\):
\[ 1 < f \le \min\left( \frac{D N^2}{n} - n + 1,\; \frac{n - 1}{n D - 1} \right) \]
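As a sketch, these bounds can be computed as follows in Python; the handling of the case \(nD \le 1\), where the \(D_W \le 1\) bound does not bind, is our reading of the rearrangement above, and the function name is ours.

```python
def f_upper_bound(N, n, D):
    """Upper bound on f from Appendix B.

    Combines the connectivity bound (from D_B >= (n/N)**2) with the
    density bound (from D_W <= 1); the latter only applies when n*D > 1.
    """
    connectivity_bound = D * N**2 / n - n + 1
    if n * D > 1:
        density_bound = (n - 1) / (n * D - 1)
        return min(connectivity_bound, density_bound)
    return connectivity_bound

# Example: N = 100, n = 4, D = 0.1 gives 1 < f <= 247
# (here n * D = 0.4 < 1, so only the connectivity bound applies).
print(f_upper_bound(100, 4, 0.1))
```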
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Wringe, C., Stepney, S., Trefzer, M.A. (2023). Modelling and Evaluating Restricted ESNs. In: Genova, D., Kari, J. (eds) Unconventional Computation and Natural Computation. UCNC 2023. Lecture Notes in Computer Science, vol 14003. Springer, Cham. https://doi.org/10.1007/978-3-031-34034-5_13
DOI: https://doi.org/10.1007/978-3-031-34034-5_13
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-34033-8
Online ISBN: 978-3-031-34034-5