Abstract
Recently, Freund, Mansour and Schapire established that using an exponential weighting scheme to combine classifiers reduces the problem of overfitting. In addition, Helmbold, Kwek and Pitt showed that, in the framework of prediction using a pool of experts, an instance-based weighting scheme improves performance. Motivated by these results, we propose an instance-based exponential weighting scheme in which the weights of the base classifiers are adjusted according to the test instance x. A competency classifier ci is constructed for each base classifier hi to predict whether hi's guess of x's label can be trusted, and the weight of hi is adjusted accordingly. We show that this instance-based exponential weighting scheme enhances the performance of AdaBoost.
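To make the scheme concrete, the following is a minimal Python sketch of how such an instance-based exponential weighting could be applied at prediction time. The classifier interface (a scikit-learn-style predict method), the scaling constant eta, and the exp(±eta) rescaling rule are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def iboost_predict(x, base_classifiers, alphas, competency_classifiers, eta=1.0):
    """Sketch of instance-based exponentially weighted voting (assumed form).

    Each base classifier h_i votes with its AdaBoost coefficient alpha_i,
    rescaled by exp(+eta) if its paired competency classifier c_i judges
    h_i's guess on x trustworthy, and by exp(-eta) otherwise.
    """
    votes = {}
    for h, alpha, c in zip(base_classifiers, alphas, competency_classifiers):
        label = h.predict([x])[0]    # base classifier's guess for x
        trusted = c.predict([x])[0]  # 1 if the guess is deemed trustworthy, 0 otherwise
        weight = alpha * np.exp(eta if trusted == 1 else -eta)
        votes[label] = votes.get(label, 0.0) + weight
    # return the label with the largest total weight
    return max(votes, key=votes.get)
```

With eta = 0 this reduces to ordinary AdaBoost weighted voting, so in this sketch the competency classifiers serve only to shift weight away from base classifiers that are predicted to err on x.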
References
L. Breiman. Bagging predictors. Machine Learning, 24(2):123–140, 1996.
T. G. Dietterich and E. B. Kong. Machine learning bias, statistical bias, and statistical variance of decision tree algorithms. Technical report, Department of Computer Science, Oregon State University, 1995.
Y. Freund. Boosting a weak learning algorithm by majority. Inform. Comput., 121(2):256–285, September 1995. Also appeared in COLT90.
Yoav Freund, Yishay Mansour, and Robert Schapire. Why averaging classifiers can protect against overfitting. In Proc. of the 8th International Workshop on Artificial Intelligence and Statistics, 2001.
Yoav Freund and Robert E. Schapire. Experiments with a new boosting algorithm. In Proc. 13th International Conference on Machine Learning, pages 148–156. Morgan Kaufmann, 1996.
David Helmbold, Stephen Kwek, and Leonard Pitt. Learning when to trust which experts. In Computational Learning Theory: EuroCOLT '97, pages 134–149. Springer-Verlag, 1997.
Michael I. Jordan and Robert A. Jacobs. Hierarchical mixtures of experts and the EM algorithm. CBCL Paper 83, M. I. T. Center for Biological and Computational Learning, August 1993.
Michael Kearns and Leslie Valiant. Cryptographic limitations on learning Boolean formulae and finite automata. J. ACM, 41(1):67–95, 1994.
Ron Kohavi and David H. Wolpert. Bias plus variance decomposition for zero-one loss functions. In Proc. 13th International Conference on Machine Learning, pages 275–283. Morgan Kaufmann, 1996.
Eun Bae Kong and Thomas G. Dietterich. Error-correcting output coding corrects bias and variance. In Proc. 12th International Conference on Machine Learning, pages 313–321. Morgan Kaufmann, 1995.
N. Littlestone and M. K. Warmuth. The weighted majority algorithm. Inform. Comput., 108(2):212–261, 1994.
Richard Maclin. Boosting classifiers regionally. In Proceedings of the 15th National Conference on Artificial Intelligence (AAAI-98) and of the 10th Conference on Innovative Applications of Artificial Intelligence (IAAI-98), pages 700–705, Menlo Park, July 26–30 1998. AAAI Press.
D. Opitz and R. Maclin. Popular ensemble methods: An empirical study. Technical Report UMD CS TR 98-1, University of Maryland, 1998.
Robert E. Schapire. The strength of weak learnability. Machine Learning, 5(2):197–227, 1990.
Robert E. Schapire, Yoav Freund, Peter Bartlett, and Wee Sun Lee. Boosting the margin: a new explanation for the effectiveness of voting methods. In Proc. 14th International Conference on Machine Learning, pages 322–330. Morgan Kaufmann, 1997.
Ljupco Todorovski and Saso Dzeroski. Combining classifiers with meta decision trees. Machine Learning Journal, 2002. To appear.
L. G. Valiant. A theory of the learnable. Commun. ACM, 27(11):1134–1142, November 1984.
I. H. Witten and E. Frank. Nuts and bolts: Machine learning algorithms in Java. In Data Mining: Practical Machine Learning Tools and Techniques with Java Implementations, pages 265–320. Morgan Kaufmann, 2000.
D. Wolpert. Stacked generalization. Neural Networks, 5(2):241–260, 1992.
© 2002 Springer-Verlag Berlin Heidelberg
Kwek, S., Nguyen, C. (2002). iBoost: Boosting Using an instance-Based Exponential Weighting Scheme. In: Elomaa, T., Mannila, H., Toivonen, H. (eds) Machine Learning: ECML 2002. Lecture Notes in Computer Science, vol. 2430. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36755-1_21