Abstract
This paper presents a novel connectionist memory-rule based model capable of learning the finite-state properties of an input language from a set of positive examples. The model is based upon an unsupervised recurrent self-organizing map [1] with laterally interconnected neurons. A derivation of functional-equivalence theory [2] is used that allows the model to exploit similarities between the future context of previously memorized sequences and the future context of the current input sequence. This bottom-up learning algorithm binds functionally-related neurons together to form states. Results show that the model is able to learn the Reber grammar [3] perfectly from a randomly generated training set and to generalize to sequences longer than those found in the training set.
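To make the two ideas combined in the abstract concrete, the following minimal Python sketch pairs a recurrent self-organizing map, whose winning neuron depends on both the current input symbol and the previous winner, with a crude functional-equivalence grouping that binds neurons sharing the same future context into states. It is an illustrative assumption, not the authors' implementation: the map size, learning rates, context weighting, and the toy training strings are all invented for the example.

import numpy as np

rng = np.random.default_rng(0)

SYMBOLS = "BTSXPVE"                 # Reber-grammar alphabet
DIM = len(SYMBOLS)
N_NEURONS = 16                      # map size (assumed)
ALPHA_IN, ALPHA_CTX = 0.3, 0.1      # learning rates (assumed)
BETA = 0.5                          # weighting of context vs. input in the match

def one_hot(sym):
    v = np.zeros(DIM)
    v[SYMBOLS.index(sym)] = 1.0
    return v

# Each neuron holds an input-weight vector and a context-weight vector.
w_in = rng.random((N_NEURONS, DIM))
w_ctx = rng.random((N_NEURONS, N_NEURONS))

def winner(x, ctx):
    """Best-matching neuron for input x, given the previous winner's context."""
    d = (1 - BETA) * np.linalg.norm(w_in - x, axis=1) \
        + BETA * np.linalg.norm(w_ctx - ctx, axis=1)
    return int(np.argmin(d))

def train_sequence(seq):
    """Process one symbol string, updating weights and recording the winners."""
    ctx = np.zeros(N_NEURONS)
    winners = []
    for sym in seq:
        x = one_hot(sym)
        k = winner(x, ctx)
        w_in[k] += ALPHA_IN * (x - w_in[k])
        w_ctx[k] += ALPHA_CTX * (ctx - w_ctx[k])
        ctx = np.zeros(N_NEURONS)
        ctx[k] = 1.0                 # next step's context is the current winner
        winners.append(k)
    return winners

def bind_states(sequences):
    """Merge neurons that share the same set of successor symbols into states,
    a rough stand-in for functional-equivalence binding."""
    futures = {}                     # neuron index -> set of symbols that follow it
    for seq in sequences:
        ws = train_sequence(seq)
        for k, nxt in zip(ws, seq[1:] + " "):    # pad the final step
            futures.setdefault(k, set()).add(nxt)
    states = {}
    for k, fut in futures.items():
        states.setdefault(frozenset(fut), []).append(k)
    return list(states.values())

if __name__ == "__main__":
    toy_training = ["BTSSXSE", "BPVVE", "BTXXVVE"]   # toy Reber-like strings (assumed)
    print(bind_states(toy_training))

In this sketch the "states" are simply groups of neurons whose observed future contexts coincide; the paper's actual algorithm derives such groupings from its functional-equivalence criterion rather than from this simplistic successor-set test.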
References
1. McQueen T, Hopgood A, Tepper J, Allen T. A Recurrent Self-Organizing Map for Temporal Sequence Processing. In: Proceedings of the 4th International Conference on Recent Advances in Soft Computing (RASC2002), Nottingham, 2002
2. Hopcroft J, Ullman J. Introduction to Automata Theory, Languages and Computation, vol 1. Addison-Wesley, 1979
3. Cleeremans A, Servan-Schreiber D, McClelland J. Finite State Automata and Simple Recurrent Networks. Neural Computation 1989; 1:372–381
4. Collier R. An Historical Overview of Natural Language Processing Systems that Learn. Artificial Intelligence Review 1994; 8(1)
5. Chomsky N. Aspects of the Theory of Syntax. MIT Press, 1965
6. Gold EM. Language Identification in the Limit. Information and Control 1967; 10:447–474
7. Horning JJ. A Study of Grammatical Inference. PhD thesis, Stanford University, California, 1969
8. Elman JL. Finding Structure in Time. Cognitive Science 1990; 14:179–211
9. Omlin C. Understanding and Explaining DRN Behaviour. In: Kolen J, Kremer S (eds) A Field Guide to Dynamical Recurrent Networks. IEEE Press, New York, 2001, pp 207–227
10. Kolen J. Fool's Gold: Extracting Finite State Machines from Recurrent Network Dynamics. In: Cowan J, Tesauro G, Alspector J (eds) Advances in Neural Information Processing Systems 6. Morgan Kaufmann, San Francisco CA, 1994, pp 501–508
11. Kohonen T. Self-Organizing Maps, vol 1. Springer-Verlag, Germany, 1995
12. Barreto G, Araújo A. Time in Self-Organizing Maps: An Overview of Models. International Journal of Computer Research: Special Edition on Neural Networks: Past, Present and Future 2001; 10(2):139–179
13. Pinker S. Words and Rules. Phoenix, London, 2000
14. Marcus GF. Children's Overregularization and Its Implications for Cognition. In: Broeder P, Murre J (eds) Models of Language Acquisition: Inductive and Deductive Approaches. Oxford University Press, Oxford, 2000, pp 154–176
15. Cohen NJ, Squire LR. Preserved Learning and Retention of Pattern-Analyzing Skill in Amnesia: Dissociation of Knowing How and Knowing That. Science 1980; 210:207–210
16. Voegtlin T. Recursive Self-Organizing Maps. Neural Networks 2002; 15(8–9):979–991
17. Hopgood AA. Intelligent Systems for Engineers and Scientists, 2nd edition. CRC Press LLC, Florida, 2001, pp 195–222
18. Sharkey N, Sharkey A, Jackson S. Are SRNs Sufficient for Modelling Language Acquisition? In: Broeder P, Murre J (eds) Models of Language Acquisition: Inductive and Deductive Approaches. Oxford University Press, Oxford, 2000, pp 33–54
19. Lawrence S, Giles C, Fong S. Natural Language Grammatical Inference with Recurrent Neural Networks. IEEE Transactions on Knowledge and Data Engineering 2000; 12(1):126–140
Copyright information
© 2005 Springer-Verlag London Limited
About this paper
Cite this paper
McQueen, T., Hopgood, A.A., Allen, T.J., Tepper, J.A. (2005). Extracting Finite Structure from Infinite Language. In: Bramer, M., Coenen, F., Allen, T. (eds) Research and Development in Intelligent Systems XXI. SGAI 2004. Springer, London. https://doi.org/10.1007/1-84628-102-4_1
DOI: https://doi.org/10.1007/1-84628-102-4_1
Publisher Name: Springer, London
Print ISBN: 978-1-85233-907-4
Online ISBN: 978-1-84628-102-0