Abstract
SVM-based rule extraction has become an important preprocessing technique for data mining, pattern classification, and related tasks. Two key problems must be solved in classification rule extraction based on SVMs: ranking the importance of attributes and discretizing continuous attributes. In this paper, a new measure for determining the importance level of attributes based on trained SVR (support vector regression) classifiers is first proposed. Building on this measure, a new approach for dividing the continuous attribute space based on support vectors is presented, and a new approach for extracting classification rules from trained SVR classifiers is given. The performance of the new approach is demonstrated on several computing cases. The experimental results show that the proposed approach remarkably improves the validity of the extracted classification rules compared with other rule-construction approaches, especially for complicated classification problems.
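The abstract only outlines the two steps (attribute importance ranking from a trained SVR, and division of the continuous attribute space at support vectors); the paper's exact measure and partitioning scheme are not stated here. The following is a minimal sketch under assumed choices: a sensitivity-style importance score (mean absolute change of the SVR output under a small perturbation of one attribute) and per-attribute cut points taken from the support vectors' coordinates. Both are illustrative assumptions, not the authors' definitions.

```python
# Illustrative sketch only: the importance measure and the cut-point rule below
# are assumptions standing in for the paper's (unstated) definitions.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVR

# Train an SVR on class labels treated as regression targets (SVR used as a classifier).
X, y = load_iris(return_X_y=True)
X = MinMaxScaler().fit_transform(X)
svr = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y.astype(float))

def attribute_importance(model, X, delta=0.05):
    """Assumed measure: mean absolute change in the SVR output when one
    attribute is shifted by a small step, averaged over the training data."""
    base = model.predict(X)
    scores = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] += delta
        scores.append(np.mean(np.abs(model.predict(Xp) - base)))
    return np.array(scores)

importance = attribute_importance(svr, X)
ranking = np.argsort(importance)[::-1]  # most important attribute first

# Assumed discretization: cut each continuous attribute at the (unique, rounded)
# values its support vectors take, giving intervals for rule antecedents.
sv = svr.support_vectors_
cut_points = [np.unique(np.round(sv[:, j], 3)) for j in range(X.shape[1])]

print("importance ranking (attribute indices):", ranking)
print("cut points for the top attribute:", cut_points[ranking[0]])
```

In this sketch the resulting intervals per attribute, ordered by the importance ranking, would serve as the candidate antecedents from which classification rules are read off the trained SVR.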
Copyright information
© 2008 Springer-Verlag Berlin Heidelberg
Cite this paper
Zhang, D., Duan, A., Fan, Y., Wang, Z. (2008). A New Approach to Division of Attribute Space for SVR Based Classification Rule Extraction. In: Sun, F., Zhang, J., Tan, Y., Cao, J., Yu, W. (eds) Advances in Neural Networks - ISNN 2008. ISNN 2008. Lecture Notes in Computer Science, vol 5263. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-87732-5_77
DOI: https://doi.org/10.1007/978-3-540-87732-5_77
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-87731-8
Online ISBN: 978-3-540-87732-5