
Information Enhancement Learning: Local Enhanced Information to Detect the Importance of Input Variables in Competitive Learning

  • Conference paper
Engineering Applications of Neural Networks (EANN 2009)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 43)


Abstract

In this paper, we propose a new information-theoretic method called "information enhancement learning" to realize competitive learning and self-organizing maps. In addition, we propose a computational method to detect the importance of input variables and to find the optimal input variables. In our information enhancement learning, there are three types of information, namely, self-enhancement, collective enhancement, and local enhancement. With self-enhancement and collective enhancement, we can realize self-organizing maps. In addition, we use local enhanced information to detect the importance of input units or input variables. The variance of local information is then used to determine the optimal values of the enhanced information. We applied the method to an artificial data set. On this problem, information enhancement learning produced self-organizing maps close to those produced by the conventional SOM. In addition, the importance of input variables detected by local enhanced information corresponded to the importance obtained by directly computing errors.
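The abstract compares the proposed local-enhancement measure against the importance obtained by "directly computing errors". The paper's own equations are not reproduced on this page, so the following is only a minimal sketch of that error-based baseline: plain winner-take-all competitive learning (no neighborhood function, so not a full SOM) on a hypothetical two-variable data set, where the importance of each input variable is estimated by how much the quantization error grows when that variable is ablated (replaced by its mean). The data, unit count, and learning rate are all illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical artificial data: variable 0 carries the two-cluster
# structure, variable 1 is uninformative noise.
n = 200
labels = rng.integers(0, 2, n)
X = np.column_stack([
    labels * 4.0 + rng.normal(0.0, 0.3, n),  # informative variable
    rng.normal(0.0, 0.3, n),                 # noise variable
])

def train_competitive(X, n_units=4, epochs=30, lr=0.1, seed=1):
    """Winner-take-all competitive learning (simplified illustration)."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), n_units, replace=False)].copy()
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            w = np.argmin(((W - x) ** 2).sum(axis=1))  # winning unit
            W[w] += lr * (x - W[w])                    # move winner toward input
    return W

def quantization_error(X, W):
    """Mean squared distance from each input to its nearest unit."""
    d = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2)
    return d.min(axis=1).mean()

W = train_competitive(X)
base = quantization_error(X, W)

# Error-based importance: ablate one variable at a time and measure
# the increase in quantization error over the intact baseline.
importance = []
for k in range(X.shape[1]):
    Xa = X.copy()
    Xa[:, k] = Xa[:, k].mean()
    importance.append(quantization_error(Xa, W) - base)

print(importance)  # variable 0 should dominate
```

Under these assumptions, ablating the cluster-carrying variable collapses both clusters onto their common mean and the quantization error rises sharply, while ablating the noise variable barely changes it, so the informative variable receives a much larger importance score.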





Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kamimura, R. (2009). Information Enhancement Learning: Local Enhanced Information to Detect the Importance of Input Variables in Competitive Learning. In: Palmer-Brown, D., Draganova, C., Pimenidis, E., Mouratidis, H. (eds) Engineering Applications of Neural Networks. EANN 2009. Communications in Computer and Information Science, vol 43. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-03969-0_9

  • DOI: https://doi.org/10.1007/978-3-642-03969-0_9

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-03968-3

  • Online ISBN: 978-3-642-03969-0

  • eBook Packages: Computer Science, Computer Science (R0)
