
A Comparison of Methods for Learning of Highly Non-separable Problems

  • Conference paper
Artificial Intelligence and Soft Computing – ICAISC 2008 (ICAISC 2008)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 5097)

Abstract

Learning in cases that are almost linearly separable is easy, but for highly non-separable problems all standard machine learning methods fail. Many strategies for building adaptive systems are based on the “divide-and-conquer” principle. Constructive neural network architectures with novel training methods make it possible to overcome some drawbacks of standard backpropagation MLP networks. They are able to handle complex multidimensional problems in reasonable time, creating models with a small number of neurons. In this paper a comparison of our new constructive c3sep algorithm, based on the k-separability idea, with several sequential constructive learning methods is reported. Tests have been performed on the parity function, the three artificial Monk's problems, and a few benchmark problems. Simple and accurate solutions have been discovered with the c3sep algorithm even in highly non-separable cases.
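
The parity function mentioned above is the standard example of a highly non-separable problem, and the k-separability idea behind c3sep is what makes such problems tractable: the data only need to be projected onto a line so that they fall into a small number of single-class intervals. The sketch below, in plain NumPy, is not the authors' c3sep implementation; the bit count n = 8 and the perceptron training schedule are illustrative assumptions. It contrasts the two views of n-bit parity: a linear threshold unit cannot get close to perfect accuracy, while the projection onto w = (1, ..., 1) splits the same data into n + 1 pure intervals, i.e. parity is (n+1)-separable.

```python
# Minimal illustration (plain NumPy, not the authors' c3sep code) of why n-bit
# parity defeats linear separation but is simple in the k-separability sense.
# The bit count n and the training schedule below are illustrative assumptions.
import numpy as np
from itertools import product

n = 8                                         # number of input bits (assumed)
X = np.array(list(product([0, 1], repeat=n)), dtype=float)   # all 2^n binary vectors
y = X.sum(axis=1).astype(int) % 2             # parity: 1 iff an odd number of bits is set

# 1) A single linear threshold unit (perceptron) cannot separate parity.
w, b = np.zeros(n), 0.0
targets = 2 * y - 1                           # map labels {0, 1} -> {-1, +1}
for _ in range(200):                          # simple perceptron epochs
    for xi, ti in zip(X, targets):
        if ti * (xi @ w + b) <= 0:            # misclassified (or on the boundary)
            w += ti * xi
            b += ti
acc = np.mean(((X @ w + b) > 0).astype(int) == y)
print(f"perceptron accuracy on {n}-bit parity: {acc:.2f}")   # stays far below 1.0

# 2) k-separability view: projecting onto w = (1, ..., 1), i.e. counting the set
#    bits, puts the data into n + 1 intervals, each containing a single class.
t = X.sum(axis=1).astype(int)
for value in range(n + 1):
    classes = sorted(set(y[t == value].tolist()))
    print(f"projection value {value}: classes {classes}")    # always one class
```

Counting the intervals produced by the best one-dimensional projection is, roughly speaking, what distinguishes a k-separable problem from a merely non-separable one, and it is why constructive methods built on this idea can stay small on parity-like data.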

References

  1. Cherkassky, V., Mulier, F.: Learning from data. In: Adaptive and learning systems for signal processing, communications and control. John Wiley & Sons, Inc., New York (1998)

    Google Scholar 

  2. Duch, W.: K-Separability. In: Kollias, S., Stafylopatis, A., Duch, W., Oja, E. (eds.) ICANN 2006. LNCS, vol. 4131, pp. 188–197. Springer, Heidelberg (2006)

    Chapter  Google Scholar 

  3. Grochowski, M., Duch, W.: Learning Highly Non-separable Boolean Functions Using Constructive Feedforward Neural Network. In: de Sá, J.M., Alexandre, L.A., Duch, W., Mandic, D. (eds.) ICANN 2007. LNCS, vol. 4668, pp. 180–189. Springer, Heidelberg (2007)

    Chapter  Google Scholar 

  4. Duch, W., Setiono, R., Zurada, J.: Computational intelligence methods for understanding of data. Proceedings of the IEEE 92(5), 771–805 (2004)

    Article  Google Scholar 

  5. Muselli, M.: Sequential constructive techniques. In: Leondes, C. (ed.) Optimization Techniques. Neural Network Systems, Techniques and Applications, vol. 2, pp. 81–144. Academic Press, San Diego (1998)

    Google Scholar 

  6. Young, S., Downs, T.: Improvements and extensions to the constructive algorithm carve. In: Vorbrüggen, J.C., von Seelen, W., Sendhoff, B. (eds.) ICANN 1996. LNCS, vol. 1112, pp. 513–518. Springer, Heidelberg (1996)

    Google Scholar 

  7. Muselli, M.: On sequential construction of binary neural networks. IEEE Transactions on Neural Networks 6(3), 678–690 (1995)

    Article  MathSciNet  Google Scholar 

  8. Campbell, C., Vicente, C.: The target switch algorithm: a constructive learning procedure for feed-forward neural networks. Neural Computations 7(6), 1245–1264 (1995)

    Article  Google Scholar 

  9. Mascioli, F.M.F., Martinelli, G.: A constructive algorithm for binary neural networks: The oil-spot algorithm. IEEE Transactions on Neural Networks 6(3), 794–797 (1995)

    Article  Google Scholar 

  10. Marchand, M., Golea, M.: On learning simple neural concepts: from halfspace intersections to neural decision lists. Network: Computation in Neural Systems 4, 67–85 (1993)

    Article  Google Scholar 

  11. Zollner, R., Schmitz, H.J., Wünsch, F., Krey, U.: Fast generating algorithm for a general three-layer perceptron. Neural Networks 5(5), 771–777 (1992)

    Article  Google Scholar 

  12. Duch, W., Jankowski, N.: Survey of neural transfer functions. Neural Computing Surveys 2, 163–213 (1999)

    Google Scholar 

  13. Duch, W.: Towards comprehensive foundations of computational intelligence. In: Duch, W., Mandziuk, J. (eds.) Challenges for Computational Intelligence, vol. 63, pp. 261–316. Springer, Heidelberg (2007)

    Chapter  Google Scholar 

  14. Thurn, S.: The monk’s problems: a performance comparison of different learning algorithms. Technical Report CMU-CS-91-197, Carnegie Mellon University (1991)

    Google Scholar 

  15. Muselli, M., Liberati, D.: Binary rule generation via hamming clustering. IEEE Transactions on Knowledge and Data Engineering 14, 1258–1268 (2002)

    Article  Google Scholar 

  16. Merz, C., Murphy, P.: UCI repository of machine learning databases (1998-2004), http://www.ics.uci.edu/~mlearn/MLRepository.html

  17. Duch, W.: Support Vector Neural Training. In: Duch, W., Kacprzyk, J., Oja, E., Zadrożny, S. (eds.) ICANN 2005. LNCS, vol. 3697, pp. 67–72. Springer, Heidelberg (2005)

    Google Scholar 

  18. Murthy, S., Kasif, S., Salzberg, S.: A system for induction of oblique decision trees. Journal of Artificial Intelligence Research 2, 1–32 (1994)

    MATH  Google Scholar 

Download references

Editor information

Leszek Rutkowski, Ryszard Tadeusiewicz, Lotfi A. Zadeh, Jacek M. Zurada

Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Grochowski, M., Duch, W. (2008). A Comparison of Methods for Learning of Highly Non-separable Problems. In: Rutkowski, L., Tadeusiewicz, R., Zadeh, L.A., Zurada, J.M. (eds) Artificial Intelligence and Soft Computing – ICAISC 2008. ICAISC 2008. Lecture Notes in Computer Science, vol 5097. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-69731-2_55

  • DOI: https://doi.org/10.1007/978-3-540-69731-2_55

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-69572-1

  • Online ISBN: 978-3-540-69731-2

  • eBook Packages: Computer Science, Computer Science (R0)
