Abstract
Boosting is an effective classifier combination method that can improve the classification performance of an unstable learning algorithm, but it yields little improvement for a stable one. TAN (Tree-Augmented Naive Bayes) is a tree-like Bayesian network. The standard TAN learning algorithm produces a stable TAN classifier, whose accuracy is therefore difficult to improve by boosting. In this paper, a new TAN learning algorithm called GTAN and a TAN classifier combination method called Boosting-MultiTAN are presented. Experimental comparisons with the standard TAN classifier show that Boosting-MultiTAN achieves higher classification accuracy on most data sets.
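To make the combination step concrete, the following is a minimal sketch of an AdaBoost.M1-style loop over weighted base classifiers. It is not the paper's GTAN or Boosting-MultiTAN procedure; a TAN learner is assumed to expose fit(X, y, sample_weight) and predict(X), and scikit-learn's GaussianNB is used here only as a stand-in base learner, since TAN is not part of scikit-learn.

```python
# Sketch of an AdaBoost.M1-style committee over weighted base classifiers.
# GaussianNB is a placeholder for a TAN learner that accepts instance weights.
import numpy as np
from sklearn.naive_bayes import GaussianNB


def boost_base_classifiers(X, y, n_rounds=10, make_base=GaussianNB):
    n = len(y)
    w = np.full(n, 1.0 / n)            # instance weight distribution D_t
    members, alphas = [], []
    for _ in range(n_rounds):
        clf = make_base().fit(X, y, sample_weight=w)
        pred = clf.predict(X)
        err = np.sum(w[pred != y])     # weighted training error
        if err == 0 or err >= 0.5:     # AdaBoost.M1 stopping condition
            break
        beta = err / (1.0 - err)
        w[pred == y] *= beta           # down-weight correctly classified cases
        w /= w.sum()                   # renormalize to a distribution
        members.append(clf)
        alphas.append(np.log(1.0 / beta))
    return members, alphas


def boosted_predict(members, alphas, X):
    # Weighted vote over the committee members' class predictions.
    classes = members[0].classes_
    votes = np.zeros((len(X), len(classes)))
    for clf, a in zip(members, alphas):
        pred = clf.predict(X)
        for k, c in enumerate(classes):
            votes[pred == c, k] += a
    return classes[np.argmax(votes, axis=1)]
```

In this scheme a stable base learner tends to produce nearly identical members each round, which is why the paper introduces GTAN to generate more diverse TAN classifiers before combining them.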
Copyright information
© 2003 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Shi, H., Wang, Z., Huang, H. (2003). Improving Classification Performance by Combining Multiple TAN Classifiers. In: Wang, G., Liu, Q., Yao, Y., Skowron, A. (eds) Rough Sets, Fuzzy Sets, Data Mining, and Granular Computing. RSFDGrC 2003. Lecture Notes in Computer Science, vol 2639. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-39205-X_105
DOI: https://doi.org/10.1007/3-540-39205-X_105
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-14040-5
Online ISBN: 978-3-540-39205-7
eBook Packages: Springer Book Archive