Abstract
Hyper Surface Classification (HSC), which is based on the Jordan Curve Theorem in topology, is an accurate and efficient classification algorithm. The hyper surface obtained by the training process exhibits excellent generalization performance on datasets that are not only large in size but also high in dimensionality. The classification knowledge hidden in the classifier, however, is hard for humans to interpret, so obtaining explicit classification rules is an important problem. In this paper, we first extract a rule from each sample directly. To avoid rule redundancy, two optimization policies, selecting a Minimal Consistent Subset (MCS) of the training set and merging neighboring cubes, are applied to reduce the rule set. Experimental results show that the two policies accurately capture the knowledge implied by the hyper surface and preserve the good generalization performance of HSC. Moreover, the time required to classify unlabeled samples with the rule set is reduced correspondingly.
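To make the extraction-and-reduction idea in the abstract concrete, the sketch below illustrates one plausible reading of it: each training sample is mapped to a labelled grid cell ("cube") that serves as a rule, and axis-adjacent cubes carrying the same label are then merged to shrink the rule set. The grid granularity, the function names, and the greedy single-axis merge are illustrative assumptions made for this sketch, not the authors' actual procedure.

    # Hypothetical sketch of per-sample rule extraction and neighbour merging.
    # Assumes features are scaled to [0, 1]; names and granularity are illustrative.

    def extract_cube_rules(samples, labels, granularity=10):
        """Map every sample in [0, 1]^d to an integer grid cell and label it."""
        rules = {}
        for x, y in zip(samples, labels):
            cube = tuple(min(int(v * granularity), granularity - 1) for v in x)
            rules[cube] = y          # one rule per occupied cube
        return rules

    def merge_neighbouring_cubes(rules):
        """Greedily merge cubes adjacent along the first axis that share a label."""
        merged = []
        seen = set()
        for cube, y in sorted(rules.items()):
            if cube in seen:
                continue
            hi = list(cube)
            nxt = (cube[0] + 1,) + cube[1:]
            while rules.get(nxt) == y and nxt not in seen:
                seen.add(nxt)
                hi[0] = nxt[0]
                nxt = (nxt[0] + 1,) + nxt[1:]
            seen.add(cube)
            merged.append((cube, tuple(hi), y))   # rule: cube <= cell <= hi -> y
        return merged

    if __name__ == "__main__":
        X = [(0.12, 0.30), (0.18, 0.33), (0.80, 0.75)]
        y = [0, 0, 1]
        print(merge_neighbouring_cubes(extract_cube_rules(X, y)))

Under these assumptions, each merged entry is an interval rule over grid cells; an unlabeled sample is classified by locating its cell and returning the label of the covering rule, which is faster than consulting one rule per training sample.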
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
He, Q., Li, J., Shi, Z. (2009). Rule Extraction and Reduction for Hyper Surface Classification. In: Yu, W., He, H., Zhang, N. (eds) Advances in Neural Networks – ISNN 2009. ISNN 2009. Lecture Notes in Computer Science, vol 5552. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-01510-6_11
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-01509-0
Online ISBN: 978-3-642-01510-6