Abstract
VSF–Network (Vibration Synchronizing Function Network) is a hybrid neural network that combines a Chaos Neural Network with a hierarchical neural network and is designed for symbol learning. VSF–Network identifies unknown parts of input data by comparing them with stored patterns, and it learns the unknown parts using the unused portion of the network. New patterns are learned incrementally and stored as sub-networks, and combinations of patterns are represented as combinations of these sub-networks. In this paper, two theoretical backgrounds of VSF–Network are introduced. First, an incremental learning framework based on Chaos Neural Networks is presented. Next, the recognition of patterns composed of combined symbols is described and explained from the viewpoints of differential topology and mixture distributions. Through an experiment, both the incremental learning capability and the recognition of combined patterns are demonstrated.
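The following is a minimal Python sketch of the control flow described above (compare an input with stored patterns, treat the unexplained residual as unknown, and store it in previously unused capacity). It is an illustration only: the paper's VSF–Network realizes this inside a Chaos Neural Network with selective weight updates, whereas here stored patterns are simplified to prototype vectors; the class name `SubNetworkStore`, the cosine-similarity criterion, and the threshold are assumptions, not the authors' method.

```python
import numpy as np

# Illustrative sketch only: stored patterns are simplified to prototype vectors,
# and the "unused part of the network" becomes an empty slot in a prototype list.
# All names and thresholds below are assumptions, not taken from the paper.

class SubNetworkStore:
    def __init__(self, threshold=0.8):
        self.prototypes = []        # each stored pattern plays the role of a sub-network
        self.threshold = threshold  # similarity needed to count a part of the input as "known"

    @staticmethod
    def _cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    def recognize(self, x):
        """Indices of stored sub-networks whose pattern matches part of x."""
        return [i for i, p in enumerate(self.prototypes)
                if self._cosine(x, p) >= self.threshold]

    def learn_unknown(self, x):
        """Subtract the known components of x; if a residual remains,
        store it as a new sub-network (incremental learning, no retraining)."""
        matched = self.recognize(x)
        residual = x.astype(float).copy()
        for i in matched:
            p = self.prototypes[i]
            residual -= (residual @ p) * p / (p @ p + 1e-12)  # remove the explained component
        if np.linalg.norm(residual) > 1e-3:
            self.prototypes.append(residual / np.linalg.norm(residual))
        return matched, residual


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    store = SubNetworkStore()
    a, b = rng.normal(size=8), rng.normal(size=8)
    store.learn_unknown(a)        # the first pattern is stored as sub-network 0
    store.learn_unknown(a + b)    # any sufficiently novel component is stored as a new sub-network
    print(len(store.prototypes))  # 2: a combined input is covered by a combination of sub-networks
```

Because the real sub-networks live inside one chaotic network and are selected by synchronization, this vector-based sketch mirrors only the incremental bookkeeping, not the network dynamics.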
Copyright information
© 2013 Springer International Publishing Switzerland
About this paper
Cite this paper
Kakemoto, Y., Nakasuka, S. (2013). Selective Weight Update for Neural Network – Its Backgrounds. In: Yoshida, T., Kou, G., Skowron, A., Cao, J., Hacid, H., Zhong, N. (eds) Active Media Technology. AMT 2013. Lecture Notes in Computer Science, vol 8210. Springer, Cham. https://doi.org/10.1007/978-3-319-02750-0_12
DOI: https://doi.org/10.1007/978-3-319-02750-0_12
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-02749-4
Online ISBN: 978-3-319-02750-0
eBook Packages: Computer Science (R0)