Abstract
We investigate how the performance of finite-context predictive models built upon the recurrent activations of two types of recurrent neural networks (RNNs) evolves when the networks are trained on strings generated by the Reber grammar. The first type is a second-order version of the Elman simple RNN, trained in a supervised manner to perform next-symbol prediction. The second is an unsupervised alternative: a second-order RNN trained with the Bienenstock, Cooper and Munro (BCM) rule [3]. The BCM learning rule seems to fail to organize the RNN state space so as to represent the states of the Reber automaton. Nevertheless, both RNNs behave as nonlinear iterated function systems (IFSs), and for a sufficiently large number of quantization centers both yield optimal prediction performance.
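To make the setup concrete, the following is a minimal sketch, in Python, of the general pipeline the abstract describes: sampling Reber-grammar strings, driving a second-order RNN over them, quantizing the visited activations, and reading next-symbol predictions off the quantization centers. It is not the authors' experimental code; in particular the weight tensor W below is a random placeholder (untrained, the network acting only as a nonlinear IFS), and the k-means quantizer, the number of centers k, and the bits-per-symbol evaluation are illustrative assumptions.

```python
import numpy as np

# Reber grammar automaton: state -> list of (emitted symbol, next state).
REBER = {
    0: [('T', 1), ('P', 2)],
    1: [('S', 1), ('X', 3)],
    2: [('T', 2), ('V', 4)],
    3: [('X', 2), ('S', 5)],
    4: [('P', 3), ('V', 5)],
    5: [('E', None)],
}
SYMBOLS = 'BTPSXVE'
SYM_IDX = {s: i for i, s in enumerate(SYMBOLS)}

def reber_string(rng):
    """Sample one string from the Reber grammar (starts with B, ends with E)."""
    out, state = ['B'], 0
    while state is not None:
        sym, state = REBER[state][rng.integers(len(REBER[state]))]
        out.append(sym)
    return out

def run_rnn(W, string):
    """Drive a 2nd-order RNN over a string and return the visited state vectors.

    Update rule: s_{t+1}[k] = tanh( sum_{i,j} W[k,i,j] * s_t[i] * x_t[j] ),
    where x_t is the one-hot code of the current input symbol.
    """
    s = np.zeros(W.shape[0]); s[0] = 1.0         # fixed initial state
    states = []
    for sym in string[:-1]:                       # last symbol has no successor
        x = np.zeros(len(SYMBOLS)); x[SYM_IDX[sym]] = 1.0
        s = np.tanh(np.einsum('kij,i,j->k', W, s, x))
        states.append(s.copy())
    return states

def kmeans(points, k, rng, iters=30):
    """Plain k-means quantization of the collected RNN activations."""
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = points[labels == c].mean(axis=0)
    return centers

def quantize(points, centers):
    return np.argmin(((points[:, None] - centers[None]) ** 2).sum(-1), axis=1)

rng = np.random.default_rng(0)
n_state, n_sym = 4, len(SYMBOLS)
W = rng.normal(scale=0.7, size=(n_state, n_state, n_sym))   # placeholder, untrained

# Collect (activation, next symbol) pairs over a training corpus.
train = [reber_string(rng) for _ in range(500)]
acts, nexts = [], []
for s in train:
    for vec, nxt in zip(run_rnn(W, s), s[1:]):
        acts.append(vec); nexts.append(SYM_IDX[nxt])
acts, nexts = np.array(acts), np.array(nexts)

# Finite-context predictive model: one next-symbol histogram per quantization center.
k = 8
centers = kmeans(acts, k, rng)
counts = np.ones((k, n_sym))                     # Laplace smoothing
for c, nxt in zip(quantize(acts, centers), nexts):
    counts[c, nxt] += 1
probs = counts / counts.sum(axis=1, keepdims=True)

# Evaluate mean negative log-likelihood of the next symbol on fresh strings.
test = [reber_string(rng) for _ in range(100)]
nll, n = 0.0, 0
for s in test:
    for c, nxt in zip(quantize(np.array(run_rnn(W, s)), centers), s[1:]):
        nll -= np.log2(probs[c, SYM_IDX[nxt]]); n += 1
print(f"bits per symbol with k={k} centers: {nll / n:.3f}")
```

Increasing k in this sketch corresponds to refining the partition of the IFS state space; the abstract's claim is that with enough centers the resulting finite-context model reaches optimal prediction performance whether the underlying weights were trained in a supervised way or by the BCM rule.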
References
Bachman, C.M., Musman, S.A., Luong, D., Shultz, A.: Unsupervised BCM projection pursuit algorithms for classification of simulated radar presentations. Neural Networks 7 (1994) 709–728
Beňušková, L., Diamond, M.E., Ebner, F.F.: Dynamic synaptic modification threshold: computational model of experience-dependent plasticity in adult rat barrel cortex. Proc. Natl. Acad. Sci. USA 91 (1994) 4791–4795
Bienenstock, E.L., Cooper, L.N., Munro, P.W.: Theory for the development of neuron selectivity: orientation specificity and binocular interaction in visual cortex. J. Neurosci. 2 (1982) 32–48
Cleeremans, A., Servan-Schreiber, D., McClelland, J.L.: Finite state automata and simple recurrent networks. Neural Comp. 1 (1989) 372–381
Das, S., Das, R.: Induction of discrete-state machine by stabilizing a simple recurrent network using clustering. Comp. Sci. Inf. 21 (1991) 35–40
Kolen, J.F.: The origin of clusters in recurrent neural network space. In: The Proceedings of the 16th Annual Conference of the Cognitive Science Society, Atlanta, GA, August 13–16, 1994
Ron, D., Singer, E., Tishby, N.: The power of amnesia: learning probabilistic automata with variable memory length. Machine Learning 25 (1996) 117–149
Tiňo, P., Köteleš, M.: Extracting finite state representations from recurrent neural networks trained on chaotic symbolic sequences. IEEE Trans. Neural Net. 10 (1999) 284–302
Tiňo, P., Stančík, M., Beňušková, L.: Building predictive models on complex symbolic sequences with a second-order recurrent BCM network with lateral inhibition. Proc. Intl. Joint Conf. Neural Net. vol. 2 (2000) 265–270
Copyright information
© 2001 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Čerňanský, M., Beňušková, L. (2001). Finite-State Reber Automaton and the Recurrent Neural Networks Trained in Supervised and Unsupervised Manner. In: Dorffner, G., Bischof, H., Hornik, K. (eds) Artificial Neural Networks — ICANN 2001. ICANN 2001. Lecture Notes in Computer Science, vol 2130. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44668-0_103
DOI: https://doi.org/10.1007/3-540-44668-0_103
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-42486-4
Online ISBN: 978-3-540-44668-2
eBook Packages: Springer Book Archive