A New Approach to Division of Attribute Space for SVR Based Classification Rule Extraction

A New Approach to Division of Attribute Space for SVR Based Classification Rule Extraction

  • Conference paper
Advances in Neural Networks - ISNN 2008 (ISNN 2008)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5263)


Abstract

SVM-based rule extraction has become an important preprocessing technique for data mining, pattern classification, and related tasks. Two key problems must be solved in SVM-based classification rule extraction: ranking the importance of attributes and discretizing continuous attributes. In this paper, a new measure for determining the importance level of attributes from trained SVR (support vector regression) classifiers is first proposed. Based on this measure, a new approach for dividing the continuous attribute space according to the support vectors is presented, and a method for extracting classification rules from trained SVR classifiers is given. The performance of the new approach is demonstrated on several computing cases. The experimental results show that the proposed approach remarkably improves the validity of the extracted classification rules compared with other rule-construction approaches, especially for complicated classification problems.
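This page only summarizes the method, so the Python sketch below is an illustrative stand-in rather than the authors' algorithm: it assumes scikit-learn's SVR, uses a simple perturbation-sensitivity score as a placeholder for the paper's importance measure, and takes support-vector coordinates along the top-ranked attribute as candidate split points for dividing the attribute space. The function name attribute_importance, the dataset, and all parameter values are hypothetical.

```python
# Minimal sketch of the overall workflow described in the abstract:
# (1) train an SVR on a binary classification task with labels in {-1, +1},
# (2) rank attributes by a sensitivity-style proxy (the paper's exact
#     importance measure is not reproduced here),
# (3) place candidate split points at the support-vector coordinates of the
#     most important attribute, as a basis for interval-style rules.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.svm import SVR

# Binary sub-problem from Iris: class 0 vs. class 1, labels mapped to {-1, +1}.
X, y = load_iris(return_X_y=True)
mask = y < 2
X, y = X[mask], np.where(y[mask] == 0, -1.0, 1.0)

svr = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)

def attribute_importance(model, X, delta=0.05):
    """Importance proxy: mean absolute change of the SVR output when one
    attribute is shifted by a small fraction of its range."""
    base = model.predict(X)
    scores = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] += delta * (X[:, j].max() - X[:, j].min())
        scores.append(np.mean(np.abs(model.predict(Xp) - base)))
    return np.array(scores)

ranking = np.argsort(attribute_importance(svr, X))[::-1]
print("attribute ranking (most to least important):", ranking)

# Candidate split points taken from the support vectors of the trained SVR,
# which is the general idea the title refers to; details are in the paper.
top = ranking[0]
split_points = np.unique(np.round(svr.support_vectors_[:, top], 2))
print("candidate split points on attribute", top, ":", split_points)
```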




Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Zhang, D., Duan, A., Fan, Y., Wang, Z. (2008). A New Approach to Division of Attribute Space for SVR Based Classification Rule Extraction. In: Sun, F., Zhang, J., Tan, Y., Cao, J., Yu, W. (eds) Advances in Neural Networks - ISNN 2008. ISNN 2008. Lecture Notes in Computer Science, vol 5263. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-87732-5_77


  • DOI: https://doi.org/10.1007/978-3-540-87732-5_77

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-87731-8

  • Online ISBN: 978-3-540-87732-5

  • eBook Packages: Computer Science (R0)
